Big data is a game changing opportunity for financial services companies. McKinsey Global Institute’s June 2011 report, Big data: The next frontier for innovation, competition, and productivity, estimated that US banks and capital markets firms together had more than an exabyte of stored data in 2009. That much data makes a unified data management system – one with high-performance analytics to grow revenue, reduce risks, prevent fraud and meet regulations – imperative.
The tumult in the financial markets shows no signs of slowing, as Europe’s debt crisis unfolds and the US recovery remains fragile. For retail banks, industry analysts estimate that the cost of recent regulations, combined with continued low interest rates, could reduce retail bank revenues by 30 to 50 percent, according to a December 2011 BAI Executive Report. Meanwhile, the technology needed to manage risk and regulation continues to chew up 15 percent of IT investments. Financial services companies will struggle in 2012 to find the optimal channel mix to deliver value to clients.
Getting big data right
The risk side of banking is all about controlling costs and exposure – consider market, counterparty credit or liquidity risks; reputational harm; or government fines that affect financial health or solvency. Data integration and quality are paramount.
A risk analytics data model defines instruments, positions and counterparties along with market data, risk factors and models to compute risk exposures. It also supports stress testing and scenario analysis. With risk data housed in a unified repository, it is much easier to analyze market and credit risks, asset-liability management and liquidity risks. Aggregating risks across all portfolios provides a complete risk picture to the firm. This will help the executive committee and boards of directors understand total firm exposure and how that compares to the firm’s risk appetite.
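To make the aggregation idea concrete, here is a minimal sketch of rolling positions up to a firm-wide view and comparing the total to a risk appetite. The field names, figures and appetite threshold are all hypothetical, and a real repository would hold positions in a database rather than in memory – the point is only that consistent, unified data makes the roll-up a one-step operation.

```python
from collections import defaultdict

# Hypothetical extract from a unified risk repository: one record per
# position, tagged with portfolio and risk type. Field names and values
# are illustrative, not from any specific bank's data model.
positions = [
    {"portfolio": "equities",     "risk_type": "market",    "exposure_musd": 120.0},
    {"portfolio": "equities",     "risk_type": "credit",    "exposure_musd": 45.0},
    {"portfolio": "fixed_income", "risk_type": "market",    "exposure_musd": 310.0},
    {"portfolio": "fx",           "risk_type": "liquidity", "exposure_musd": 80.0},
]

# Firm-wide exposure by risk type: straightforward only because all
# portfolios share one repository with consistent definitions.
firmwide = defaultdict(float)
for p in positions:
    firmwide[p["risk_type"]] += p["exposure_musd"]

# Compare total exposure to a (hypothetical) board-approved risk appetite.
risk_appetite_musd = 600.0
total_exposure = sum(firmwide.values())
within_appetite = total_exposure <= risk_appetite_musd
```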
One global bank implemented such techniques to perform regulatory and capital calculations and regulatory reporting at the group level. The bank processes more than 100 million rows of data per month, along with a reporting repository of more than 5 billion rows. The firm’s single version of the truth now encompasses both risk and finance, helping to close the books faster. A unified data model and repository will help any firm meet the challenges of Basel III identified by the Global Association of Risk Professionals.
Full balance sheet risk analysis for assessing liquidity also demands integrating big data from multiple locations. Without integrated data, long calculation times can stretch to hundreds of hours, inhibiting timely decision making. An Asia Pacific-based bank tested high-performance analytical techniques to calculate a range of liquidity risk measures. It analyzed a portfolio of 30 million complex cash flow instruments across 50,000 different scenarios in less than eight hours. The ability to fully revalue liquidity risk nightly ensures informed funding decisions, even in times of market volatility.
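The scenario analysis described above reduces to revaluing the same portfolio under many stressed assumptions. The toy sketch below shows the shape of that computation with made-up numbers; real workloads span millions of instruments and tens of thousands of scenarios and distribute the revaluation across a compute grid.

```python
# Toy stand-in for liquidity scenario analysis: each scenario applies a
# stress factor to the portfolio's base value. Figures are hypothetical.
base_value_musd = 500.0                          # portfolio value today
stress_factors = [1.00, 0.97, 0.92, 1.03, 0.88]  # one factor per scenario

# Revalue the portfolio under every scenario.
scenario_values = [base_value_musd * f for f in stress_factors]

# Simple liquidity measures derived from the scenario set.
worst_case = min(scenario_values)
expected = sum(scenario_values) / len(scenario_values)
```

Because each scenario is independent, the loop parallelizes trivially – which is why the eight-hour run cited above is feasible at all.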
What does that mean? By quickly determining exposure, portfolio value at risk and liquidity coverage, the firm can determine products to take to market or markets to exit much faster. It can fine-tune responses to changes in interest rates, exchange rates and counterparty risk to remain competitive.
Imagine the advantage to a large US bank that reduces loan default calculation time from 96 hours to just four for a portfolio of more than 10 million mortgages. The bank can detect high-risk accounts much more quickly to forecast losses and hedge risk – plus make decisions about further lending.
One of the key considerations in using analytics to better understand risk: Can you do it faster without increasing staff? One large UK-based bank slashed model development time from 13 weeks to six – enabling it to build twice the number of models with the same staff.
But keep in mind, the firm had already integrated its data sources onto a single platform, which let analysts stop spending enormous amounts of time manipulating data instead of analyzing it. While the morale boost alone helped with retention, the more measurable effect was a significant improvement in collections and recovery.
Real-time risk assessment
The proactive element of understanding risk is critical. A large Canadian bank wanted to use 12 years of monthly account-level credit card data, credit bureau information and bank account information to better assess the risk before granting loans or raising credit limits. Ideally, it wanted this information in real time. To speed the computing, it used an in-database approach. When analytics work within the database, data doesn’t need to be extracted, transformed and loaded. As a result, the bank could calculate risk 70 times faster.
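To illustrate the in-database idea, the sketch below pushes a scoring rule into SQL so it runs where the data lives, rather than extracting rows into a separate analytics tool. SQLite stands in for the bank's warehouse, and the table, columns and scoring weights are all hypothetical.

```python
import sqlite3

# SQLite as a stand-in for a production data warehouse. Schema and data
# are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accounts (
        account_id        INTEGER,
        utilization       REAL,     -- credit utilization ratio
        months_delinquent INTEGER
    )
""")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?, ?)",
    [(1, 0.95, 2), (2, 0.30, 0), (3, 0.80, 1)],
)

# In-database scoring: the (made-up) risk formula runs inside the SQL
# engine, so no rows are extracted, transformed and loaded elsewhere.
high_risk = conn.execute("""
    SELECT account_id
    FROM accounts
    WHERE 0.6 * utilization + 0.2 * months_delinquent > 0.9
    ORDER BY account_id
""").fetchall()
```

Only the small result set leaves the database, which is the source of the speedup the article describes – the heavy lifting never crosses the network.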
With credit cards, proactive analysis can spot fraudsters before they run up thousands of dollars in fake charges – and stop them without inadvertently denying a legitimate purchase. A large global bank uses high-speed, real-time analytics to determine at the point of sale whether the purchase is legitimate.
The bank is so enthusiastic about the reduction in fraud losses that it has expanded the analytics to look at customers’ online banking transactions. The intent is to build a more accurate profile of a given customer’s “normal” – and of what tips off that an account has been compromised.
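A minimal sketch of what profiling a customer's "normal" can look like: flag a transaction whose amount sits far outside the customer's own history. The history, threshold and single-feature rule are illustrative assumptions; a production system scores many behavioral features together in real time.

```python
import statistics

# Hypothetical prior transaction amounts for one customer.
history = [42.0, 55.0, 38.0, 60.0, 47.0]
mean = statistics.mean(history)
stdev = statistics.stdev(history)

def is_suspicious(amount, z_threshold=3.0):
    # Flag amounts more than z_threshold standard deviations from this
    # customer's own mean – a crude stand-in for a learned profile.
    return abs(amount - mean) / stdev > z_threshold
```

A routine purchase like `is_suspicious(50.0)` passes quietly, while an outlier like `is_suspicious(5000.0)` is flagged – the same logic that lets the bank stop a fraudster without denying a legitimate purchase.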
The challenge for the years ahead will be to balance increased regulatory costs and the need for greater efficiency while at the same time delivering an improved customer experience and innovations to retain customers and grow revenues. High-performance analytics will help bank executives pave the path to success – with customers, regulators and in the market.
Download the special report Big data, bigger opportunities to read more articles on high-performance analytics.