As a result of the recent market shocks, banks, capital markets firms and asset managers are rethinking their risk management strategies. In this series we are discussing the need for a new risk management framework. In The evolution of risk management, we pointed out that traditional risk management techniques failed during the financial crisis that began in 2008, and that more sophisticated analytics now make traditional quantitative techniques more transparent and available to decision makers. By incorporating all elements of the risk and reward equation – exposures, return, capital reserves, capital deployed in various forms, firm liquidity and market liquidity – we now have the opportunity to build and grow high-performance risk management capabilities within firms.
In Quantitative math or management analytics, we focused on the issues that banks and financial firms recognized as the key problems leading to the crisis. What were those problems, and how could careful planning keep them from recurring – even if an identical situation never arises again?
This is the fourth installment of our seven-part series, and in it we concentrate on the risk management framework. For firms to be successful going forward, the risk management paradigm will shift from discrete, quantitative exercises to an overall management analytics view. This paradigm shift is paralleled by a shift in the nature of computing and analytics. Banks, capital markets firms and asset managers have relied on traditional transactional and reporting technology since the early mainframe systems were deployed in the 1970s. Quantitative measures – and the associated calculations that emerged in the 1980s and 1990s – started to extend the analytics framework beyond mere transaction reporting, but the vision of a predictive analytics framework has eluded firms.
Today’s results are yesterday’s news
One reason for the shift is the concept that – to analyze the cross-correlations among market, portfolio and event data – a firm must combine as much transaction data as possible into one central repository.
Additionally, because of limitations in computational technology, expert judgment weighed heavily in selecting the prime factors used to calculate exposures, in order to keep calculation times to one or two days. Finally, senior management could not gain a firmwide view of risk because the resulting exposure calculations were then reassembled into a large set of analysis reporting cubes. Results that took days to calculate were further obscured by being summarized into dimensional analysis cubes that required a dedicated team of specialists to interpret and report.
Because of this, senior management has traditionally received only an outdated and highly summarized view of risk across all portfolios. Until recent advances in high-performance computing environments made it possible to dynamically monitor positions, markets and information events, and to measure and track capital movements, the ability to watch all of these components together had eluded financial firms.
A technology framework that enables the recalculation of a set of exposures based on new, dynamic risk factors – with the ability to distribute the results in real time – is what leading firms are seeking. To extend dynamic monitoring into a truly predictive analytic process, firms have also had to adopt advances in analytic modeling techniques and build dynamic event interaction into their analytic models.
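To make the idea concrete, here is a minimal Python sketch of what recalculating exposures against an updated set of risk factors might look like. The portfolio names, factor names and simple linear sensitivity model are illustrative assumptions only, not any particular firm's or vendor's implementation.

```python
# A minimal sketch of on-demand exposure recalculation, assuming a simple
# linear sensitivity model. All names (portfolios, factors) are illustrative.

positions = {
    "PORT_A": {"IR_USD_10Y": 1_200_000, "FX_EURUSD": -300_000},        # factor sensitivities per unit move
    "PORT_B": {"IR_USD_10Y": -450_000, "CREDIT_IG_SPREAD": 800_000},
}

def recalc_exposures(positions, factor_shocks):
    """Recompute each portfolio's exposure for an updated set of factor shocks."""
    exposures = {}
    for portfolio, sensitivities in positions.items():
        exposures[portfolio] = sum(
            sensitivity * factor_shocks.get(factor, 0.0)
            for factor, sensitivity in sensitivities.items()
        )
    return exposures

# New market data arrives: shock sizes per risk factor (e.g., +5 bp on 10Y rates).
latest_shocks = {"IR_USD_10Y": 0.0005, "FX_EURUSD": -0.012, "CREDIT_IG_SPREAD": 0.0010}

for portfolio, exposure in recalc_exposures(positions, latest_shocks).items():
    print(f"{portfolio}: {exposure:,.0f}")   # results could be pushed to dashboards in real time
```

The point of the sketch is the shape of the workflow, not the model: each time a new factor observation arrives, exposures are recomputed and distributed immediately rather than queued for an overnight batch.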
The future is now
Inferences about how market events are affecting markets and portfolios in real time cannot be made with traditional summary-reporting technology.
This technology approach simply cannot keep pace. Expert judgment in front-office management roles, middle-office monitoring functions or capital funding desks can only be augmented in a meaningful way by incorporating predictive analytics. And predictive analytics can only be truly implemented with a technology environment that computes cross-portfolio, market, risk tolerance and capital optimization factors on demand. Traditional reporting and database storage technology does not do this. High-performance computational environments that can compute and store results for multiple, parameterized what-if scenarios can. With the ability to execute multiple scenarios on demand across a large factor set, expert decision makers can quickly reduce all available information to the core set needed for optimization. Without this ability, firms must continue to guess based on historical experience, assuming that results will (in most cases) revert to an expected mean or outcome.
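As an illustration of what running multiple parameterized what-if scenarios on demand can mean in practice, the following Python sketch sweeps a small grid of hypothetical factor shocks and surfaces the worst outcomes. The factor names, shock sizes and toy P&L model are assumptions made for illustration; a production platform would distribute such runs across a high-performance grid rather than a single loop.

```python
# A minimal sketch of an on-demand, parameterized what-if scenario sweep.
from itertools import product

factor_grid = {
    "rate_shock_bp": [-50, 0, 50, 100],
    "equity_shock_pct": [-0.10, 0.0, 0.10],
    "liquidity_haircut_pct": [0.0, 0.05],
}

def portfolio_pnl(rate_shock_bp, equity_shock_pct, liquidity_haircut_pct):
    """Toy P&L model: linear in each factor, illustrative only."""
    rate_pnl = -2_000 * rate_shock_bp           # DV01-style loss per basis point
    equity_pnl = 5_000_000 * equity_shock_pct   # beta-weighted equity exposure
    liquidity_cost = -3_000_000 * liquidity_haircut_pct
    return rate_pnl + equity_pnl + liquidity_cost

# Evaluate every combination in the factor grid and keep the worst outcomes.
scenarios = [
    (dict(zip(factor_grid, combo)), portfolio_pnl(*combo))
    for combo in product(*factor_grid.values())
]
worst = sorted(scenarios, key=lambda s: s[1])[:3]

for params, pnl in worst:
    print(f"P&L {pnl:>12,.0f}  under {params}")
```

Even in this toy form, the pattern shows how a decision maker can narrow a large factor set down to the handful of scenarios that actually drive the risk-and-return trade-off.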
Being able to process large quantities of information in real time and immediately analyze the results changes the paradigm to one of adaptive learning. The ability to decide where to allocate capital and when to exit existing risk-versus-return positions is what gives firms that adopt this type of technology first a competitive advantage. Expert judgment can be applied within the new construct of adaptive learning, based on real-time analysis of events, to accelerate the response to market events. SAS has focused its latest research and development efforts on creating high-performance computational platforms for risk that are optimized for the calculation of risk, the optimization of capital, and the measurement of market events and liquidity.
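The adaptive-learning idea can be illustrated with a very simple example: updating a risk estimate incrementally as each market event arrives, rather than waiting for an overnight batch. The Python sketch below uses a standard exponentially weighted (RiskMetrics-style) volatility update; the decay parameter and the simulated event stream are assumptions for illustration only.

```python
# A minimal sketch of adaptive updating: refresh a risk estimate on every event.

def make_ewma_estimator(decay=0.94):
    """Exponentially weighted estimator of squared returns (RiskMetrics-style)."""
    state = {"variance": None}

    def update(latest_return):
        if state["variance"] is None:
            state["variance"] = latest_return ** 2
        else:
            state["variance"] = decay * state["variance"] + (1 - decay) * latest_return ** 2
        return state["variance"] ** 0.5   # current volatility estimate

    return update

update_vol = make_ewma_estimator()

# Simulated intraday event stream; in practice these would arrive in real time.
for r in [0.001, -0.004, 0.012, -0.020, 0.003]:
    vol = update_vol(r)
    print(f"return {r:+.3f} -> updated vol estimate {vol:.4f}")
```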
In our next article, we begin laying the practical building blocks of this framework. Our fifth article, “Risk management is capital optimization,” proposes that firms take a broader view of risk that includes optimization of return, management of capital and the impact of liquidity on market prices. And if you missed any of the previous essays in the series, now is a good time to go back and read them: