Business analytics is a hot topic these days. It helps drive evidence-based decisions, allowing organizations to compete and serve customers better. To be successful, however, organizations need to manage a myriad of people, processes and, of course, technologies. IT leaders are under increasing pressure to deliver in environments characterized by:
a) “Big Data” (very large data sets, and the storage and processing facilities needed to manage them).
b) Complex data from text, social media and traditional operational systems.
c) Challenging analytical tasks that push systems to the limits.
d) Pressure to deliver results quickly.
Recent customer surveys and industry observations suggest IT and business analytics professionals don’t feel as prepared as they should: their infrastructure is not flexible enough and has not kept pace with the market’s changes and speed. IT lacks sufficient computing power to support the wide variety of business analytics tasks designed to gain business insights quickly and execute on those insights where it counts.
Whether performing massive calculations in insurance, credit risk analysis in retail banking, markdown optimization in retail, firmwide liquidity management and stress testing, or store-specific product assortment planning, business units are looking to IT leadership to help them make faster, more accurate decisions that can tip the balance toward growth in this turbulent economy. In the tug of war for resources, IT leaders need to take note of the following key business analytics platform requirements to support lines of business.
Speed: Execute business analytics activities faster
Speed and timely availability of analytic results should not be constrained by scalability issues. For example, analytic end users should not have to compromise on the new data sources they can tap into, or be restricted to asking only simple questions. To support these iterative analytic tasks, IT leaders should look to reduce data latency and unnecessary data movement for business analytics. In the era of Big Data, in-database processing helps by creating an integrated database/data warehousing and business analytics footprint.
More data, users and applications
Demands on business analytics systems are increasing with larger numbers of business units embracing the need for fact-based decision making in every aspect of the business. With more business need comes increased user numbers, data explosion and analytical application demand. IT needs to help create a high-performance business analytics computing environment that can guarantee uptime and high availability as well as speed and agility.
To support high availability, speed and agility, IT should look to improve hardware utilization and scale-out. Making full use of high-performance computing technologies allows IT to use idle capacity or add new applications (as needed) on commodity hardware in an incremental fashion, reducing total costs and providing strong support for the enterprise business analytics environment.
Some of the key technology options for high performance business analytics environments that IT leadership should keep in mind include:
Grid computing makes it easier to match compute-intensive applications and growing numbers of users to the appropriate hardware resources, delivering quick results and improved business continuity. Customers get a managed, shared environment in which to process large volumes of analyses more efficiently. Using a grid-based business analytics environment, a financial services firm was able to reduce probability-of-loan-default calculation time from 96 hours to just four. Users also get a platform on which to schedule and prioritize key operational analysis tasks, so that ad-hoc or discovery-based tasks do not compete for limited resources.
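The scatter/gather pattern behind a grid-based calculation can be illustrated in miniature. The sketch below distributes independent per-loan default calculations across a worker pool; the loan features and logistic model are hypothetical stand-ins, and a real grid would scatter work across many machines rather than local workers:

```python
# Minimal sketch of grid-style scatter/gather for risk scoring.
# The model and field names are hypothetical; a production grid
# distributes these independent tasks across a cluster of servers.
from concurrent.futures import ThreadPoolExecutor
import math

def default_probability(loan):
    """Toy logistic score from hypothetical loan features."""
    x = 0.02 * loan["debt_ratio"] - 0.01 * loan["credit_score"] + 3.0
    return 1.0 / (1.0 + math.exp(-x))

def score_portfolio(loans, workers=4):
    """Scatter loans across a worker pool, gather the scores in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(default_probability, loans))
```

Because each loan is scored independently, the job parallelizes cleanly: doubling the workers roughly halves the wall-clock time, which is the effect behind the 96-hours-to-four result described above.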
In-database computing moves the relevant data management, analytics and reporting tasks closer to where data is stored and lets computations run inside the corporate data store. It reduces unnecessary data movement and promotes better governance while enhancing speed to solution. A leading shopper-driven marketing solutions firm successfully moved the company’s scoring of purchase behavior models into its database for faster processing. As a result, models that used to take half a business day to process can now be scored in 60 seconds. With this performance boost, the company expects to be able to develop and test 600 models per year with the same staff that used to deliver 40 to 50 new models per year.
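The idea of moving scoring to the data, rather than data to the scoring code, can be shown with a small sketch. It uses sqlite3 purely as a stand-in for a corporate data store, and the table, columns and linear model are hypothetical; the point is that the score is expressed in SQL so the computation runs inside the database:

```python
# Minimal sketch of in-database scoring, with sqlite3 standing in for
# a corporate data store. Table, columns and model are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE purchases (customer_id INTEGER, recency REAL, frequency REAL)")
conn.executemany("INSERT INTO purchases VALUES (?, ?, ?)",
                 [(1, 10.0, 5.0), (2, 60.0, 1.0)])

# Instead of pulling every row into the application and scoring there,
# push the model down as a SQL expression so it runs where the data lives.
rows = conn.execute(
    "SELECT customer_id, 0.5 * frequency - 0.01 * recency AS score "
    "FROM purchases ORDER BY score DESC"
).fetchall()
```

Only the small result set crosses the wire; the row-by-row arithmetic stays in the database engine, which is what eliminates the data movement that dominates scoring time.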
In-memory analytics allows computations to be performed in memory, using a specialized grid configuration to distribute analytical computations in parallel across a dedicated set of servers, yielding faster response times and quicker insights. A leading bank, an early evaluator of an in-memory solution, was able to value a complex portfolio of more than 44,000 financial instruments in less than three minutes.
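The partition-and-combine shape of such a valuation can be sketched as follows. The instruments are held in RAM, split into chunks, valued concurrently, and the partial results summed; the present-value formula is a deliberately simplified, hypothetical stand-in for real instrument pricing:

```python
# Minimal sketch of in-memory parallel valuation: data stays in RAM,
# is partitioned into chunks, and the chunks are valued concurrently.
# The pricing formula is a hypothetical stand-in.
from concurrent.futures import ThreadPoolExecutor

def value_chunk(chunk):
    # Toy present value: notional discounted at rate r over t years.
    return sum(n / (1.0 + r) ** t for n, r, t in chunk)

def value_portfolio(instruments, partitions=4):
    size = max(1, len(instruments) // partitions)
    chunks = [instruments[i:i + size]
              for i in range(0, len(instruments), size)]
    with ThreadPoolExecutor(max_workers=partitions) as pool:
        return sum(pool.map(value_chunk, chunks))
```

Because no chunk ever touches disk, throughput scales with the number of servers holding partitions, which is how a 44,000-instrument portfolio becomes tractable in minutes.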
Event stream processing is gaining popularity as a way to analyze data in its present, real-time context and make immediate decisions. Here, analytics are applied to event data streaming in from multiple sources, even before it is stored, to detect patterns and decipher meaning along the time dimension. In capital markets, traders want to detect variance that hints at short-lived risks and opportunities to either hedge or exploit. Other examples include national security intelligence and real-time fraud detection.
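The variance-detection case above can be sketched with a sliding window over arriving ticks. Each event is analyzed the moment it arrives, before anything is persisted; the window size and threshold are hypothetical tuning parameters:

```python
# Minimal sketch of event stream processing: each tick is analyzed on
# arrival against a sliding window, and an alert fires when short-term
# variance spikes, before the data is ever stored. Window size and
# threshold are hypothetical tuning parameters.
from collections import deque
from statistics import pvariance

def detect_variance_spikes(ticks, window=5, threshold=1.0):
    """Return (index, price) pairs for ticks whose trailing-window
    variance exceeds the threshold."""
    recent = deque(maxlen=window)   # only the window is kept in memory
    alerts = []
    for i, price in enumerate(ticks):
        recent.append(price)
        if len(recent) == window and pvariance(recent) > threshold:
            alerts.append((i, price))
    return alerts
```

The key design point is that state is bounded to the window, so the detector keeps up with an unbounded stream; a trader's short-lived opportunity is flagged within one tick of appearing.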
As IT leaders, it’s imperative that we provide robust technology platforms aligned with the needs of business analytics. We must build for today and for the future with a high-performance computing strategy for business analytics.