Riding the avalanche of regulation

Moving from a basic state to a strategic state with a new data architecture

By Michael Giarrusso, Partner, Financial Services Office Advisory, EY

Banking has entered a new era of regulation. Following the bailouts of the recent financial crisis, regulators have increased their emphasis on accountability. But the largest firms in financial services can only be held accountable if their activities are easily tracked by regulators, investors and other counterparties. For this reason, accountability goes hand in hand with transparency.  

A vast number of regulatory initiatives have been put in place or are in the process of adoption, collectively requiring banks to make massive investments to improve both the quality of their data and its accessibility to supervisors and shareholders. To comply with the new rules, banks will have to spend hundreds of millions — and in some cases billions — of dollars.

The demands from regulators on data quality and disclosure are burdensome, but they also present an opportunity. Banks taking the right approach to this will do more than simply comply with the rules.

Those opting to meet the requirements with a well-planned, integrated data strategy will also improve their ability to generate revenue and manage risk — gaining a competitive advantage over those that do not.

The consequences of failure

While I firmly believe that the carrot of commercial advantage is critical, the starting point for banks must be the consequences of failing to meet the demands of regulators on data.

These are not limited to additional costs in the form of fines: Banks can actually be prevented from making capital distributions, putting investors directly in the firing line for the banks’ deficiencies.

In the US, the Fed considers data quality and governance as part of its annual bank stress test, the Comprehensive Capital Analysis and Review (CCAR). Firms have failed the qualitative aspect of the test, which measures the internal processes used by a bank to determine the amount and composition of capital it requires to survive a period of severe stress.

This failure meant the firms could not deploy capital or increase dividends, with obvious consequences for the firms’ standing with investors.

Under Pillar II of the Basel III capital requirements, data shortfalls can also result in additions to the basic level of capital a bank is required to hold. This is a material risk to firms: Assuming a 10 percent cost of equity capital, a $1 billion addition to existing capital holdings costs $100 million a year.
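To make that carrying-cost arithmetic easy to re-run under different assumptions, here is a minimal sketch; the $1 billion add-on and the 10 percent cost of equity are the illustrative figures from the paragraph above, not firm-specific values.

```python
def annual_carrying_cost(capital_add_on: float, cost_of_equity: float) -> float:
    """Yearly cost of an extra capital requirement: the add-on times the cost of equity."""
    return capital_add_on * cost_of_equity

# Illustrative figures cited above: a $1 billion add-on at a 10 percent cost of equity.
print(annual_carrying_cost(1_000_000_000, 0.10))  # 100000000.0 -> $100 million per year
```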

Moreover, ongoing data deficiencies raise significant systemic risk concerns among regulators. Firms failing to comply with data requirements face the prospect of regulatory-mandated restructuring: a chilling prospect for any commercial entity with its own strategy for profitability.

Five best practices to improve data quality

Data quality is an essential component of your new strategic data architecture. Follow these five tips to help improve the quality of your data; a brief illustrative sketch follows the list:

• Align data with risk management and regulatory requirements.
• Make the quality of data risk management transparent.
• Create business rules for sustainable data quality improvement.
• Establish continuous monitoring to measure success.
• Implement end-to-end analysis of risk processes.
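As a rough illustration of how business rules and continuous monitoring fit together, the sketch below checks a handful of hypothetical completeness rules against loan records and reports a pass rate per rule; the record layout, rule names and sample values are all invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical record layout, invented for this example; real loan data will have
# far more fields and bank-specific naming conventions.
@dataclass
class LoanRecord:
    loan_id: str
    counterparty_id: str
    exposure_usd: float
    credit_rating: str

# Business rules for data quality: named, testable conditions on each record.
RULES: Dict[str, Callable[[LoanRecord], bool]] = {
    "counterparty_populated": lambda r: bool(r.counterparty_id),
    "exposure_positive": lambda r: r.exposure_usd > 0,
    "rating_populated": lambda r: bool(r.credit_rating),
}

def monitor(records: List[LoanRecord]) -> Dict[str, float]:
    """Continuous monitoring: share of records passing each rule, recomputed on every run."""
    total = max(len(records), 1)
    return {name: sum(rule(r) for r in records) / total for name, rule in RULES.items()}

# Example run with two hypothetical records, one of them missing its counterparty.
records = [
    LoanRecord("L1", "CP-A", 1_000_000.0, "BBB"),
    LoanRecord("L2", "", 250_000.0, "A"),
]
print(monitor(records))  # {'counterparty_populated': 0.5, 'exposure_positive': 1.0, 'rating_populated': 1.0}
```

Tracking these pass rates run over run is what turns a one-off cleanup into the continuous monitoring the fourth tip calls for.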

Scope of oversight

Regulators are placing increased emphasis on the broader regulatory control and risk management framework of financial institutions, where data completeness and consistency are key components.

A number of initiatives require banks to make massive investments to improve data quality. These include:

• The Basel Committee on Banking Supervision’s principles for effective risk data aggregation and risk reporting (BCBS 239). In a recently released progress report, the BCBS noted that many banks are unlikely to meet the Jan. 1, 2016, deadline for compliance.
• Vastly increased reporting requirements to demonstrate compliance with activity restrictions, such as the Volcker Rule, or the maintenance of adequate liquidity under the Basel Committee’s liquidity coverage ratio.
• Reporting of data on counterparty exposures.
• Granular information about the credit quality of individual loans, which is needed for stress tests.


Shifting to a strategic state

To address these challenges, banks must migrate to a “strategic state” data architecture. Firms will inevitably incur onerous costs and strain resources along the way, but making the switch in a well-planned and integrated way is key to reaching that state and mitigating those costs.

Today, most banks’ data infrastructures are largely in what we call a “basic state,” defined by complex legacy architecture predicated on silos for each function such as risk, regulation or finance. Each silo requires multiple reporting sources and tools.

This leads to a large volume of reconciliations, which are often duplicated and inconsistent, as well as an incomplete transaction population. The consequences for organization and governance are severe: limited accountability and ownership, functional silos, decentralized governance and inconsistent or limited policies.

These problems can’t be solved in a day, so we envisage an interim “enabled state”: basic state infrastructure is maintained during the transition to the strategic state, while new processes and governance are put in place.

The longer-term goal, the strategic state, consists of a streamlined systems architecture built around a centralized database. Data is integrated across all domains, including risk, regulation and finance, and there is a single source for all reporting.

This minimizes the volume of reconciliations, with the enabled state framework used as an input for the design and requirements of the strategic state architecture. The established organizational and governance structure is carried over from the enabled state, with minor modifications to support strategic processes and technology.
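As a toy illustration of what a single source for all reporting means in practice (all table and field names below are invented for the example), each function reads its view from the same canonical records, so there are no per-silo copies to reconcile:

```python
# Toy illustration of the "strategic state" idea: one canonical store, many views.
# In a basic-state architecture each function keeps its own copy of the transactions
# and reconciles the copies afterwards; here every report is a projection of the
# same records, so there is nothing to reconcile. All names are invented.

TRANSACTIONS = [  # the single, centralized source (illustrative schema only)
    {"id": "T1", "desk": "rates", "notional": 5_000_000, "counterparty": "CP-A", "rwa": 400_000},
    {"id": "T2", "desk": "fx", "notional": 2_000_000, "counterparty": "CP-B", "rwa": 150_000},
]

def risk_report() -> int:
    """Risk view: total risk-weighted assets, read straight from the central store."""
    return sum(t["rwa"] for t in TRANSACTIONS)

def regulatory_report(counterparty: str) -> int:
    """Regulatory view: exposure to one counterparty, drawn from the same records."""
    return sum(t["notional"] for t in TRANSACTIONS if t["counterparty"] == counterparty)

print(risk_report())              # 550000
print(regulatory_report("CP-A"))  # 5000000
```

The point of the sketch is the shape, not the schema: risk, finance and regulatory reporting all project from one store instead of reconciling independent extracts.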

The business case

This is not simply a question of conforming to regulation; an integrated data architecture offers many business benefits. The interests of management and shareholders are aligned with those of regulators.

In the context of pure profitability, getting it right on data is critical. EY’s analysis suggests that regulatory implementation is likely to cost around 3 percent of return on equity (RoE). In the new, low-RoE environment for banks (roughly 9 percent on average for North American banks and lower for European banks in 2014), these numbers are meaningful. Early implementation of a strategic state data architecture will deliver cost efficiencies that could help offset this downward pressure on RoE in the long run.
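One way to read those figures (an interpretation for illustration, not EY’s published model) is that a drag of roughly three percentage points against a nine percent baseline consumes about a third of current returns:

```python
# Read as an illustration only: the 3 percent cost is interpreted as percentage points
# of RoE, and 9 percent is taken as the baseline RoE cited above. Both are assumptions.
baseline_roe = 0.09
regulatory_drag = 0.03
print(regulatory_drag / baseline_roe)  # ~0.33 -> roughly one-third of baseline returns
```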

As the recent CCAR stress tests have demonstrated, taking control of data preserves banks’ ability to make distributions and restores it for those that have fallen short of regulatory expectations. Addressing data concerns also reduces or even eliminates the rationale for capital add-ons, and in this context the resulting savings should be seen as an offset to the cost of data systems.

CFOs, treasurers and CROs are under pressure from all sides on data: Senior management, boards and investors all require disclosures that are consistent across stakeholders. Appetite is also growing for risk-adjusted performance analysis, projections and accurate, risk-based pricing for financial products.

On the investor side, demand for disclosure is growing in intensity and detail. With the Financial Stability Board’s forthcoming total loss-absorbing capacity requirements, investors will be looking to banks to provide key data before they consider investing in instruments subject to a potential bail-in. This applies both to a bank’s capital position and to the capital instruments themselves.

But the rationale for making the investment to enter a strategic state data architecture is not simply about adapting to a new normal. High-quality data allows firms to outperform peers in risk management and other areas of the business. Devoting resources to data architecture is therefore not just a cost, but an investment.


Michael Giarrusso is a New York-based partner in EY’s Financial Services Office Advisory practice. He leads the Enterprise Regulatory Transformation team, advising multiple clients on regulatory requirements related to enhanced prudential standards and the establishment of enterprise program management capabilities. Giarrusso has more than 14 years of experience in the financial services industry, serving banking and capital markets and insurance clients in the areas of risk and regulatory change, enterprise risk governance, credit risk and capital management.
