In the capital markets world, money rarely changes hands physically; it passes through the markets only digitally. These markets are fast and vast: complex networks moving extremely large volumes of money and goods. On a single day on the London stock market, more than 700,000 trades may occur, at a value of over £4 billion. In this environment, understanding Value at Risk (VaR) becomes a complex, time-critical problem.
One large UK bank decided it was time to evaluate its risk across all of its trading books. Being able to consolidate this VaR – and see the figures within minutes – would give the bank a significant edge over competitors who still trade on individual books and work from VaR data that is as much as two days old.
As in any risky, fast-paced environment, the right information at the right time makes all the difference – even when that environment includes thousands of orders every second. Some of the problems the bank faced were:
- Reports are not generally available until one day after the transaction (T+1).
- No ability to readily understand the timeliness of the underlying data being presented.
- Data from one processing system to the next has no commonly agreed definition.
- Data is aggregated with no drill-down capability.
- No commonly agreed upon official snapshot of trades or reference data for consistent enrichment.
- No understanding of what input parameters are used for the underlying calculation.
- Spikes or exceptions in the data are not automatically identified.
- Users cannot create watch-lists of specific trades or subjects of interest.
So the bank decided to deliver a high-performance, automated VaR calculation and reporting process for market risk managers, allowing them to spend more of their time evaluating risk rather than investigating data discrepancies. This was, of course, a major undertaking, and it meant confronting some fundamental issues.
The basic problem the bank faced was a lack of access to relevant data and an inability to analyze that data appropriately. With no standardized view of the day-old data, it was simply too hard to make accurate, timely decisions.
To overcome these challenges, the bank built a system that aggregates the data on the fly into a single repository. Instead of waiting until the following day for the latest reports, the risk management team can now see an up-to-date view of its risk exposures intraday, using the latest blended view of interim and official risk figures to inform business decisions sooner. Drill-down graphing and analytic tools then make sense of this complex environment – all before the closing bell – giving the bank the chance to change course if the VaR exceeds its risk appetite.
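To illustrate the kind of calculation being consolidated, a one-day VaR can be estimated by historical simulation over a book's daily profit-and-loss history. This is a minimal sketch, not the bank's actual methodology; the P&L figures below are invented for illustration, and a real implementation would use far longer histories.

```python
def historical_var(pnl, confidence=0.99):
    """One-day Value at Risk via historical simulation.

    pnl: list of daily profit-and-loss figures (positive = gain).
    Returns the loss threshold exceeded only (1 - confidence) of the time.
    """
    losses = sorted(-p for p in pnl)           # convert to losses, ascending
    index = int(confidence * len(losses))      # e.g. the 99th percentile loss
    index = min(index, len(losses) - 1)        # guard against short samples
    return losses[index]

# Hypothetical daily P&L in GBP millions
pnl = [1.2, -0.8, 0.3, -2.1, 0.9, -1.5, 0.4, -0.2, 1.1, -3.0]
print(historical_var(pnl, confidence=0.99))
```

Aggregating VaR across books intraday amounts to rerunning this kind of percentile calculation on a continually refreshed, consolidated P&L distribution rather than on yesterday's per-book snapshots.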
For this to become a reality, it was important to leverage two new technologies. The first is Event Stream Processing (ESP), which allows decisions to be made on the fly, in milliseconds. The second is a high-performance risk solution that uses commodity hardware to distribute processing across a grid of computers, allowing access to this granular data in seconds.
ESP allows complex business decisions to be made on massive amounts of data in real time. The bank can now evaluate every trade individually before the deal is made, then accumulate it with other current trades to evaluate trends. Finally, the bank can synchronize this data with its global corporate data. In practice, this technology gives the bank:
- Continuous queries on flowing data, with incrementally updated results.
- Very low maximum event-processing latencies (microseconds to milliseconds).
- High volumes of data evaluated (more than 100,000 events per second).
- Derived event windows with retention policies.
- Memory bounded for performance (i.e., bounded state).
- Predetermined data mining, decision making, alerting, position management, scoring, profiling, and so on.
- Out-of-order event handling to produce ordered source streams.
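Several of the capabilities above – continuous queries with incrementally updated results, event windows with retention policies, bounded state, and out-of-order handling – can be sketched with a simple in-memory event window. This is a hypothetical simplification; a real ESP engine distributes this work across a grid and handles far more event types.

```python
import heapq
from collections import deque

class TradeWindow:
    """Sliding event window with a retention policy and out-of-order handling.

    Events arriving within `reorder_slack` of the newest timestamp are
    buffered and released in timestamp order; the window retains only the
    last `retention` time units of trades, so state stays bounded.
    """
    def __init__(self, retention=60.0, reorder_slack=2.0):
        self.retention = retention
        self.reorder_slack = reorder_slack
        self.buffer = []            # min-heap of (timestamp, notional)
        self.window = deque()       # ordered, retained events
        self.total = 0.0            # incrementally updated aggregate

    def on_event(self, timestamp, notional):
        heapq.heappush(self.buffer, (timestamp, notional))
        watermark = timestamp - self.reorder_slack
        # Release buffered events that can no longer be overtaken
        while self.buffer and self.buffer[0][0] <= watermark:
            ts, amt = heapq.heappop(self.buffer)
            self.window.append((ts, amt))
            self.total += amt
        # Enforce the retention policy (evict stale events)
        while self.window and self.window[0][0] < timestamp - self.retention:
            _, amt = self.window.popleft()
            self.total -= amt
        return self.total           # continuous-query result

w = TradeWindow(retention=60.0, reorder_slack=2.0)
for ts, amt in [(1.0, 10.0), (3.0, 5.0), (2.5, 7.0), (6.0, 1.0)]:
    running = w.on_event(ts, amt)
print(running)
```

Note how the trade timestamped 2.5 arrives after the one timestamped 3.0, yet both enter the window in timestamp order – the essence of out-of-order handling behind a watermark.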
But it turns out that VaR alone is no longer an adequate measure of risk, nor is it capable of informing senior management of the true nature of their market and credit risk exposures.
For full transparency and the level of detail the business requires, many different types of analysis are now needed, including, but not limited to: eVaR, sVaR, transient concentrations, extreme value analysis, abnormal correlations, and stress tests.
Advanced, high-performance analytics lets risk managers quickly and easily explore a wide range of scenarios, helping both senior management and the business better understand the true nature of the exposures on their trading books.
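One of the simplest of these analyses, a stress test, applies hypothetical market shocks to current positions. The sketch below shows the shape of such a calculation; the positions, risk factors, sensitivities, and shock sizes are all invented for illustration.

```python
def stress_test(positions, sensitivities, shocks):
    """Estimate portfolio P&L under a shock scenario.

    positions:      {instrument: notional}
    sensitivities:  {instrument: {risk_factor: P&L per unit of shock}}
    shocks:         {risk_factor: shock size}
    """
    pnl = 0.0
    for instrument, notional in positions.items():
        for factor, sens in sensitivities.get(instrument, {}).items():
            pnl += notional * sens * shocks.get(factor, 0.0)
    return pnl

# Hypothetical book: an equity index position and a bond position
positions = {"FTSE_future": 100.0, "gilt_10y": 200.0}
sensitivities = {
    "FTSE_future": {"equity_index": 0.01},   # 1% P&L per 1% index move
    "gilt_10y": {"rates_10y": -0.08},        # loses value as rates rise
}
scenario = {"equity_index": -20.0, "rates_10y": 1.5}  # crash plus rate rise
print(stress_test(positions, sensitivities, scenario))
```

Running many such scenarios across a grid, rather than one overnight batch, is what turns stress testing from a reporting exercise into a tool risk managers can explore interactively.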
Learn more about the possibilities available with event stream processing.