The “data artist” must balance creativity and control for effective predictive analytics

Senior Industry Consultant for Insurance, SAS

There has been a lot of buzz over the past couple of years about the new role of the data scientist in organizations. A recent article in the Harvard Business Review touts the role of data scientist as the “sexiest job of the 21st century.” And what exactly are the key attributes of this role? According to the article, this person is a “data hacker, analyst, communicator, and trusted adviser.” However, as those of us in the business and technology world know, you can build the most elegant solution to a problem, but if you can’t embed it efficiently within a business process, it will have limited impact. In fact, I think a better term for this role is “data artist.”

But the data artist doesn’t live in a vacuum. As competitive pressure pushes organizations to master analytics, internal analytics teams have grown more statistically sophisticated, yet they struggle to put their insights into operation. Many analytics groups find themselves managing production processes while the demand for analytics – along with data volumes and the need for speed-to-insight – grows within the organization. The challenge is to balance the creative instincts of the scientist with a flexible analytics delivery framework that adapts and evolves with the unpredictability of innovation.

Over time, analytics processes can grow so complex that it becomes difficult to identify the root causes when the process breaks down, and a breakdown in the process can set the team back days if not weeks as resources try to find and fix the problem. Since much of the work in the data mining process is knowledge-based, the question becomes: What is the right framework to support the knowledge process?

Our data artists must be cognizant of the interdependencies between the design of their model and other components of the overall system. The model development and deployment process may span a number of functional or business areas, requiring the coordination of resources, data and technology across the organization. It’s a delicate balance: Embracing formal controls requires adherence to a collective goal, often with predetermined standards or processes. This may conflict with the creative aspects or methods inherent in the data discovery and model development process.

Over the past couple of years, I have worked with several analytics groups in the insurance industry to identify and streamline their workflows using the Lean methodology. Lean’s goal is to eliminate waste and improve efficiency. It provides a lightweight toolkit for analytics lifecycle management – effectively and efficiently managing the elements (people, processes, data and technology) needed to optimize the model development process from conception to deployment.

The Lean methodology provides a way for organizations to drive value in their products and services (in this case, the analytics product) by letting workers perform their work in the most efficient and effective way possible. Lean organizations do more with less – less effort, fewer resources and less time – while providing customers (defined as anyone downstream from a process) with maximum value. Following are a few examples of how the methodology can be applied:

  • An analytics modeler in one organization spent six months on the detective work necessary to define and understand the data needed to create a predictive model answering a key business question. The actual modeling work took only two weeks. In those first six months, the modeler had to identify, and then set up meetings with, multiple system and data owners across the organization as he built out the data set for the model. The organization lacked the consistent data quality routines and metadata that would have spared this modeler from scouring the company for knowledge and resources (see the profiling sketch after this list).
  • I worked with one organization to develop a data model to support its new customer-focused analytics initiatives. Taking a just-in-time approach to development, the team identified high-value customer analytics activities in a use case format (such as customer retention, acquisition and next-best-offer) and phased in data delivery in support of those use cases. Subsequent iterations, driven by additional use cases, extend the data model further. This iterative approach ensures that the correct data, in the correct format, is available to support prioritized business-specific processes.
  • In one analytics process I analyzed, multiple teams were involved, there were more than 100 process steps, and the lead time from idea generation to execution exceeded 100 days – and that’s if everything went as planned. An error in the workflow (which was rife with data quality issues, inconsistent quality control mechanisms and an excessive number of handoffs between teams) could result in rework that set the process back two to three weeks. The time spent uncovering the problems reduced the capacity of the entire analytics team and delayed the delivery of the insight.
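
The data quality gap in the first example is the easiest to attack with a little automation. As a minimal sketch – the `profile_table` helper, table names and columns below are illustrative, not from any of the organizations described – a reusable profiling routine run once over each candidate source table builds the shared metadata catalog that would otherwise be assembled through months of interviews:

```python
import pandas as pd

def profile_table(df: pd.DataFrame, name: str) -> pd.DataFrame:
    """Capture basic metadata for one source table:
    column types, completeness and cardinality."""
    return pd.DataFrame({
        "table": name,
        "column": df.columns,
        "dtype": [str(t) for t in df.dtypes],
        "pct_missing": (df.isna().mean() * 100).round(1).values,
        "distinct": df.nunique().values,
    })

# Illustrative stand-ins for real source tables.
policies = pd.DataFrame({
    "policy_id": [101, 102, 103],
    "premium": [950.0, None, 1210.0],
    "state": ["IA", "IA", "NC"],
})
claims = pd.DataFrame({
    "claim_id": [1, 2],
    "policy_id": [101, 103],
    "paid_amount": [2400.0, None],
})

# One shared catalog, built once and kept current,
# instead of a fresh round of detective work per model.
catalog = pd.concat([
    profile_table(policies, "policies"),
    profile_table(claims, "claims"),
], ignore_index=True)
print(catalog.to_string(index=False))
```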

As you can imagine, the first step down the path to Lean is getting cross-functional stakeholders together and creating shared accountability. A critical success factor for Lean is horizontal process ownership, not vertical functional ownership. The next step is to map the current state of the process (also known as value stream mapping) and identify areas for improvement.
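
A current-state map doesn’t have to be elaborate to be useful. As a minimal sketch – the step names, durations and value-adding flags below are made up for illustration, not figures from the engagements above – classifying each mapped step and comparing value-added time against total lead time makes the waste immediately visible:

```python
# Each mapped step: (name, elapsed days, does it add value
# for the downstream customer?). All figures are illustrative.
value_stream = [
    ("define business question",    2, True),
    ("locate data owners",         15, False),  # waiting / searching
    ("assemble modeling dataset",  10, True),
    ("wait for QC sign-off",        8, False),  # queue time
    ("build and validate model",   10, True),
    ("rework after handoff error", 12, False),  # defect-driven rework
    ("deploy scoring process",      3, True),
]

lead_time = sum(days for _, days, _ in value_stream)
value_added = sum(days for _, days, adds_value in value_stream if adds_value)

# Process cycle efficiency: the share of elapsed time that actually
# creates value. In Lean terms, everything else is waste to target.
print(f"Lead time:        {lead_time} days")
print(f"Value-added time: {value_added} days")
print(f"Cycle efficiency: {value_added / lead_time:.0%}")
```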

One analytics team that I worked with last year began to use Lean improvement techniques in an operational analytics process, and that gave them back five days per month. That’s five days the team could use to focus on new and possibly game-changing initiatives for their organization. By inserting a little process discipline into their workflow, the team increased their creative capacity (and job satisfaction for data artists!). Are you thinking Lean?

For more on this topic, read the white paper: Make Your Analytics Lean.
