The disastrous global financial crisis put a spotlight on the need to get rapid insights from big data. That effort, in turn, will accelerate much-needed improvements in firms’ information management practices. “The data guys are getting their say at the corner office and getting the budget to drive many of these regulatory initiatives,” noted Larry Tabb, Chief Executive Officer of the TABB Group, a financial markets research and advisory firm.
The technologies implemented for compliance have the added advantage of driving business improvements. If done right, it’s a rare kind of two-fer. Under greater pressure than ever for transparency and risk awareness, firms stand to gain a lot from the data management revolution.
Optimization is no longer tied to a quarterly or monthly reporting cycle; it is now a matter of responding quickly to market, capital and risk factor changes as they happen. “Increasingly, we hear that clients are trying to obtain an array of risk metrics in more real time, released multiple times during the day, not just at end of day or over the weekend,” said Tabb.
The capital markets already have a great deal of sophistication in dealing with data, but the velocity – not just the volume – presents challenges. This is what people really mean when they talk about big data. For example, high-frequency trading generates an estimated 6.5 million messages per second on US exchange products. A midsize capital market firm manages 20 petabytes of data. To put it in perspective, a petabyte is a quadrillion bytes; Google processes about 24 petabytes of data per day.
In addition, capital markets firms face these challenges:
- Analyzing unstructured data, such as the millions of tweets that may yield clues as to whether to buy X or sell Y.
- Acquiring clean data, keeping acquired and internal data clean and managing it in a way that doesn’t create redundancies.
- Providing accelerated analysis of portfolio-level market, credit and liquidity risk.
- Offering up-to-the-minute assessments of risk exposures for large, complex portfolios of financial instruments, and analyzing, in near-real time, incremental value at risk (VaR), counterparty exposures and liquidity measures.
- Aggregating risk exposures to interactively analyze, explore and drill down to business unit, desk, portfolio, instrument or horizon.
- Stress testing dynamically and interactively to anticipate the impact of extreme events on portfolio values.
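To make the VaR and stress-testing items above concrete, here is a minimal, self-contained sketch in Python. It is purely illustrative, not any vendor’s method: the historical-simulation approach to VaR, the portfolio sensitivities and the shock sizes are all hypothetical.

```python
# Illustrative sketch only: historical-simulation VaR and a simple linear
# stress test. All positions, sensitivities and scenarios are invented.

def historical_var(pnl_scenarios, confidence=0.99):
    """VaR taken as the loss at the (1 - confidence) quantile of simulated P&L."""
    ordered = sorted(pnl_scenarios)            # worst outcomes first
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]                     # report VaR as a positive loss

def stressed_pnl(positions, shocks):
    """Revalue a linear portfolio under instantaneous risk-factor shocks."""
    return sum(qty * shocks.get(factor, 0.0) for factor, qty in positions.items())

# 1,000 hypothetical daily P&L scenarios (toy data standing in for revaluing
# the book against historical risk-factor moves).
scenarios = [((i * 37) % 1000) - 500 for i in range(1000)]
var_99 = historical_var(scenarios, confidence=0.99)

# Stress test: equity factor down 30%, rates up 200bp, applied to
# hypothetical factor sensitivities.
positions = {"equity_beta": 1_000_000, "dv01": -50_000}
loss = stressed_pnl(positions, {"equity_beta": -0.30, "dv01": 2.0})
```

In practice, the “dynamic and interactive” part of the challenge is rerunning a valuation like `stressed_pnl` across thousands of shock combinations fast enough for an analyst to explore them live.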
The pressure for more immediate answers requires a different kind of technology framework, one that can recalculate exposures based on emerging, dynamic risk factors – and distribute the results in real time.
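One way to picture such a framework is an event-driven engine that recomputes exposure when a risk factor moves and pushes the result to subscribers, instead of waiting for a batch reporting run. The sketch below is a hypothetical illustration; the class, factor names and sensitivities are invented, and a real system would distribute over a message bus rather than in-process callbacks.

```python
# Hypothetical sketch of event-driven exposure recalculation and
# real-time distribution. All names and numbers are invented.

class ExposureEngine:
    def __init__(self, sensitivities):
        self.sensitivities = sensitivities          # factor -> sensitivity
        self.factors = {f: 0.0 for f in sensitivities}
        self.subscribers = []

    def subscribe(self, callback):
        """Register a consumer of updated exposure figures."""
        self.subscribers.append(callback)

    def on_factor_move(self, factor, move):
        """Apply one risk-factor update, recompute, and distribute."""
        self.factors[factor] += move
        exposure = sum(self.sensitivities[f] * v for f, v in self.factors.items())
        for notify in self.subscribers:
            notify(factor, exposure)

engine = ExposureEngine({"rates": -50_000.0, "equity": 1_000_000.0})
seen = []
engine.subscribe(lambda factor, exp: seen.append((factor, exp)))

engine.on_factor_move("equity", -0.10)   # equity factor down 10%
engine.on_factor_move("rates", 0.25)     # rates factor up 25bp
```

The design point is that each update touches only the running factor state, so results reach consumers as market conditions change rather than at the end of a reporting cycle.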
The right technology foundation delivers:
- An integrated, visual view of data from multiple, disparate systems, with assured data quality and consistency, and without the requirement to create one enormous database or to forklift the data around for analysis.
- Faster processing speeds – by several orders of magnitude, using in-memory analytical techniques – so you get key risk results within the time windows that critical decisions demand.
- Considerably greater precision in extremely complex portfolio valuations, versus overly simplistic approximations.
- Always-current portfolio views of aggregated risk, accommodating data that arrives incrementally for different portions of the portfolio.
- The right level of detail with granular access control – from executives needing high-level summary views, to analysts needing to drill into the most granular details in response to regulatory inquiries.
The payoff is capturing opportunities and value that would otherwise be lost because the information came too late or didn’t account for late-breaking market conditions: firms can see threats ahead of time and capitalize on more agile planning.
When you’re ready for real-time decision making, take a look at SAS.
NOTE: Originally published in Wall Street Technology, Feb 2013.