Hitting the mark with big data

XL Insurance Group talks about its visual analytics initiative

Insurers have long seen data as a source of competitive advantage. But data alone is worthless – it's the insights derived from the data that matter, says Kimberly Holmes, Senior Vice President of Strategic Analytics with XL Insurance Group's Bermuda office. And with the emergence of big data, she notes, the possibility for deriving insights is increasing dramatically.

Yet, for those insights to have an impact on the business, they have to have the attention of senior underwriters. "The data analytics means nothing without the decision makers embracing it," Holmes insists. "We see a lot of our competitors create models that don't have an impact because the underwriters don't use them."


To foster a more collaborative approach to the analysis of large volumes of data, XL ($45.1 billion in total assets) currently is implementing SAS Visual Analytics technology. Holmes describes the solution as a powerful communication tool for creating a partnership of exploration between the analytics team and underwriters. She says the tool will bring the expertise of the decision makers and other stakeholders more deeply into the analytics process by demonstrating the meaning of data more readily and inspiring further exploration and insight.

"It's really a demonstration of the expression that 'every picture is worth a thousand words,'" Holmes adds. "The key to getting people to embrace new insights and change how they make decisions is that they believe in their gut that this insight is true."

Rewriting the rules of the game

Holmes asserts that the world of insurance is changing exponentially as volumes of available data rapidly expand and sources of data proliferate. As a result, roles within the insurance enterprise will change, along with the terms of competition. "Commercial insurance will become more efficient by creating more automation in decision making and how we access our customers," she predicts. "Those changes will happen more rapidly in smaller-account business, but we need the right technology and data to take advantage of that."

Holmes characterizes XL as one of the few carriers in the commercial insurance domain to act on this vision. "We expect investments such as SAS Visual Analytics to create enormous competitive advantage and shareholder value for XL," she says.

According to Celent Senior Analyst Benjamin Moreland, though, few insurers are ready to talk openly about their big data-related initiatives. The world of big data constitutes a paradigm shift for carriers, many of which continue to struggle with issues in their traditional transactional data.

"Carriers continue to have trust issues with internal data," Moreland says. "Many insurers are not used to using data for operational status and decision support because of their skepticism. Also, business-line-specific data orientation has resulted in inconsistencies in reports, leaving C-level officers to ask, 'Which report should I believe?'"

Insurers that can take advantage of large volumes and new types of data early on will do better on pricing and customer segmentation, Moreland says. Their challenge will be driving data into the decision-making process. "Senior leadership often makes decisions on anecdotal evidence," he notes. "Their instincts may be strong, but they have to determine the worth of those instincts based on whether the data supports them."

Big data isn't just a matter of the volume and sources of data, but also of the speed at which the data is processed, Moreland says. In the past, carriers could crunch numbers over time, distribute reports and then make decisions; underwriters, for example, might have reviewed overexposure in a given area only in hindsight. Today many more decisions need to be made in or near real time. "The task for IT today is to bring opportunities to underwriters and other decision makers to support decisioning as events are happening," Moreland says.

Effective handling of big data, he suggests, will also enable an increasing range of automated underwriting decisions in near-real time, such as preventing the writing of new business in an area with the potential to be struck by a developing weather event.
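As a rough illustration of the kind of near-real-time rule Moreland describes, the sketch below holds new-business submissions in areas flagged by a weather-alert feed. It is purely hypothetical: the feed format, ZIP-code granularity and dispositions ("HOLD", "REFER_TO_UNDERWRITER") are illustrative assumptions, not any carrier's actual system.

    # Hypothetical sketch of an automated near-real-time underwriting rule.
    # Assumes a weather-alert feed listing ZIP codes in a developing event's path.
    from dataclasses import dataclass

    @dataclass
    class Submission:
        policy_type: str
        zip_code: str

    def zips_in_forecast_path(weather_feed: dict) -> set:
        """Collect the ZIP codes flagged by the (hypothetical) alert feed."""
        return {area["zip"] for area in weather_feed.get("affected_areas", [])}

    def automated_disposition(submission: Submission, weather_feed: dict) -> str:
        """Hold new business in flagged areas; otherwise route to an underwriter."""
        if submission.zip_code in zips_in_forecast_path(weather_feed):
            return "HOLD"  # defer binding until the event passes or is re-priced
        return "REFER_TO_UNDERWRITER"

    # Example: a tropical-storm feed flags ZIP 33139, so the submission is held.
    feed = {"affected_areas": [{"zip": "33139", "event": "tropical storm"}]}
    print(automated_disposition(Submission("commercial property", "33139"), feed))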

Martina Conlon, a principal at Novarica, suggests that company size will be a factor in how – and how quickly – a given insurer will adopt big data-related capabilities for underwriting and other purposes. Larger insurers, she says, have made far more progress in the use of large volumes of data – for example, telematics, geo-spatial data, mobile information, social media data, automated information from weather services and even clickstreams of visitors to their websites.

"Very large carriers are leveraging big data and operationalizing automated analytics in their business processes," Conlon says. "Below the top tier, most are dealing with more basic issues, such as implementing solid core systems and trying to establish a baseline business intelligence infrastructure for an integrated view of their data, as well as trying to marry-in structured external data."

Conlon says the greatest big data barrier for small carriers is cost of entry, as both initial costs and maintenance are high. Second-tier and smaller carriers also struggle to find the right talent to adopt big data capabilities. "Lower-tier carriers don't have the resources to determine whether such initiatives are worth it. Bigger firms can afford to invest in the analysis to make the business case."

Vendors will help smaller carriers punch above their R&D budget weight, suggests Conlon's Novarica colleague Greg Wittenbrook. Vendor products will begin including big data-related functionality or enabling capabilities, Wittenbrook says, and more data providers will emerge.

Life-changing technology

Whereas personal-line P&C insurance has led the big data trend, commercial lines are approaching a breakout stage. Capgemini Vice President Tony Pavia predicts that commercial insurers that embrace big data will gain market share at the expense of laggards. But Pavia says big data also is driving changes in the life insurance industry. "Term life underwriting is dramatically different than it was five or six years ago, because of the use of aggregators of data," he says. "Across the industry, carriers are going to fall behind because they lack the culture to embrace change. Over the next five years you'll see a real separation between those that are rapidly migrating to new environments and those that are not."

Wittenbrook notes that life insurers are looking for new opportunities to use data to decrease the cost and intrusiveness of life insurance underwriting, which involves the administration of various medical tests. Insurers may be able to take advantage of the general trend of consumers to share personal data in exchange for discounts, as well as the move to broader medical records and the availability of social media data. "However, there is a creepiness factor related to an insurer knowing too much about a client," he cautions, "and insurers also need to deal with constantly changing laws and regulations governing what they can and cannot access."

Prudential Financial ($961 billion in assets under management) is sensitive to the "creepiness factor" and is steering clear of it, according to Mike McFarland, VP of Underwriting in Prudential's individual life insurance business. "There are potential components of predictive modeling that some people find disturbing, but we are not doing that; we're using traditional risk points, but in a different way," McFarland says. "We're looking for more economical ways to issue life insurance, which is expensive because it utilizes this very expensive resource we call an underwriter."

McFarland refers to the emergence of vendors that collect traditional types of lab data and perform risk analysis through a scoring system. But other sources of data also are emerging, such as those that can predict the likelihood of diseases for individuals of a certain age. "Data that used to be hard to get is now aggregated and available in a very usable form," he says.

Still, McFarland says Prudential is moving cautiously with its use of big data to support underwriting. He describes the company's interest in big data as being like "a kid in a toy store" and says the carrier has gone beyond experimentation, devoting substantial resources to developing and testing predictive modeling for underwriting. "Predictive modeling will evolve," he says. "We could plug it in and make it work tomorrow. The question is, can we make it work at the right price point? It has the potential to be the better mousetrap, but will it really catch mice? We don't know."

According to McFarland, Prudential will probably perform several more months of analysis before making a decision to deploy predictive modeling for underwriting. "Whether we'll roll it out remains to be seen," he says. "There are several factors: impact on product pricing, whether or not it reduces expenses required to reach an efficiency threshold, the ability to produce the margins we decided we wanted for that business, and whether we could gather support from our reinsurance partners."

McFarland adds, "As soon as one or two companies take the leap of faith for a given product or age group, then others will follow, because it will be necessary in order to compete."

Reprinted with permission from Insurance & Technology. October 2012.


What can you do with SAS® Visual Analytics?

  • Apply the power of SAS Analytics to massive amounts of data.
  • Visually explore data at the speed of sight.
  • Share fresh insights with everyone, everywhere, via the Web or iPad®.


SAS CEO Jim Goodnight explains SAS® Visual Analytics

"SAS Visual Analytics helps business users to visually explore data on their own, but it goes well beyond traditional query and reporting. Running on low-cost, industry-standard blade servers, its high-performance in-memory architecture delivers answers in seconds or minutes instead of hours or days."

