Are you ready for Analytics 3.0?
Don’t worry. We asked analytics pioneer Tom Davenport to tell us everything you need to know.
A new age is dawning in the analytics industry: Analytics 3.0. This era is about the widespread creation of new products and services based on analytics, and the point at which leading organizations will embed analytics directly into decision and operational processes. The ones who get there will realize measurable business advantage from the combination of traditional analytics and big data.
This Q&A with Tom Davenport, Director of Research for the International Institute for Analytics (IIA), will help you understand how analytics is evolving, where you need to go, and how to get there.
What are the characteristics of Analytics 3.0?
Davenport: The most important aspect is that analytics is driving not only major operational and strategic decisions, but also the creation of new products and services for companies in every industry – not just online firms, as in the big data era. The easiest way for me to explain the other attributes is to give you a list.
- In a synthesis of traditional analytics (1.0) and big data (2.0), organizations are combining large and small volumes of data, internal and external sources, and structured and unstructured formats to yield new insights in predictive and prescriptive models.
- Analytics is central to the organization’s strategy.
- The "Hadoop-alooza" (excitement about big data technologies) continues, but often as a way to provide fast and cheap warehousing or persistence and structuring of data before analysis.
- Faster technologies such as in-database and in-memory analytics are being coupled with agile analytical methods and machine learning techniques that produce insights at a much faster rate.
- Many analytical models are being embedded into operational and decision processes, dramatically increasing their speed and impact.
- Data scientists, who excel at extracting and structuring data, are working with conventional quantitative analysts who excel at modeling it – and the combined teams are doing whatever is necessary to get the analytical job done.
- Companies are beginning to create chief analytics officer roles or equivalent titles to oversee the building of analytical capabilities.
- Tools that support particular decisions are being pushed to the point of decision making in highly targeted and mobile analytical apps.
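As a toy sketch of the last two points (the model, its coefficients and its threshold are all invented for illustration, not any vendor's actual system), an offline-trained scoring model can be embedded directly in an operational decision function:

```python
import math

# Hypothetical coefficients from an offline-trained logistic regression.
# Illustrative values only; a real model would be fit on historical data.
WEIGHTS = {"recency_days": 0.04, "purchases_90d": -0.30}
BIAS = -1.0

def churn_probability(customer: dict) -> float:
    """Score a customer with the embedded model."""
    z = BIAS + sum(WEIGHTS[k] * customer[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def next_action(customer: dict) -> str:
    """Operational decision made inline, at transaction time."""
    p = churn_probability(customer)
    return "send_retention_offer" if p > 0.5 else "no_action"

print(next_action({"recency_days": 60, "purchases_90d": 1}))  # -> send_retention_offer
```

In production, the coefficients would come from a model that is retrained regularly; the point is that scoring and the resulting decision happen inline, inside the operational process, rather than in a monthly report.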
What defines Analytics 1.0?
Davenport: Analytics 1.0 represents an era in which enterprises start assembling business intelligence systems and expertise to drive reporting and descriptive analytics. During this era, very few enterprises view their systems as capable of generating predictive or prescriptive analytics. Enterprises focus on the internal, structured data that they generate without giving much thought to other types or sources of data. In this era, most organizations do not view their data as a valuable asset, like equipment or inventory.
What defines Analytics 2.0?
Davenport: The primary difference from Analytics 1.0 is the emergence of big data: fast-moving, external, large and unstructured data coming from various new and interesting sources. As such, it has to be stored and processed rapidly, often with parallel servers running technologies like Hadoop.
The overall speed of analytics increases, and visual analytics (a form of descriptive analytics) gains prominence; however, predictive and prescriptive techniques are still not the main use of analytics. The users are primarily online firms.
In this stage, a new community of data scientists emerges that fosters experimentation, hacking and data mashups. Regardless of industry, most enterprises are discussing new data product business opportunities that may lie ahead of them. Big data is still very popular and, for many organizations, remains a challenge they are struggling to overcome.
Among these three analytic eras, where do you see most businesses operating?
Davenport: The majority of companies today are still operating within Analytics 1.0 (or, in the case of online firms, 2.0). But industry leaders are entering the Analytics 3.0 world. To achieve competitive advantage, firms must prepare for and embrace Analytics 3.0.
Who are some of the companies leading the Analytics 3.0 charge?
Davenport: Procter & Gamble is a prime example. P&G CEO Bob McDonald and CIO Filippo Passerini share an enthusiasm for sophisticated analytics and have created a culture that supports moving business intelligence from the periphery of operations to the center.
GE is another early adopter. They're investing multiple billions of dollars in a new center for software and analytics. The goal is to offer new services based on the analysis of big data from industrial products – they're putting sensors in gas turbines, jet engines and locomotives.
You see it in other industries as well. In pharmaceuticals, bioinformatics and computational biology are becoming as important as chemistry in creating new drugs. In auto insurance, several companies, including Progressive and State Farm, are putting devices in cars that measure how far and how fast you drive, and pricing your insurance accordingly.
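The auto-insurance example can be made concrete with a deliberately simplified sketch (the factors and numbers below are invented for illustration; real insurers' actuarial models are proprietary and far more sophisticated):

```python
# Hypothetical usage-based pricing: premium scales with how far and how
# aggressively you drive. All constants here are illustrative assumptions.
BASE_PREMIUM = 600.0  # assumed annual base premium, in dollars

def adjusted_premium(annual_miles: float, hard_brakes_per_100mi: float) -> float:
    """Scale the base premium by mileage and a simple driving-behavior signal."""
    mileage_factor = min(annual_miles / 12000.0, 1.5)      # surcharge capped at 1.5x
    behavior_factor = 1.0 + 0.05 * hard_brakes_per_100mi   # penalty per hard brake
    return round(BASE_PREMIUM * mileage_factor * behavior_factor, 2)

print(adjusted_premium(6000, 0))  # -> 300.0 (low-mileage, smooth driver pays less)
```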
Are there specific tools required for Analytics 3.0?
Davenport: Companies need to have the technical infrastructure to manage the volume of data. This includes technologies like Hadoop, in-memory and in-database analytics, and enough computing power to handle the complex calculations. In addition, companies need appropriate tools to effectively support decision making at the front lines, such as mobile and self-serve analytical apps.
Which industries will be most affected by Analytics 3.0? Who will benefit the most?
Davenport: All industries will be affected by Analytics 3.0. The benefits from effectively leveraging big data, embedding data into decision making and truly becoming an analytical competitor will apply to any firm in any industry.
However, the speed with which companies are able to enter this world will depend on the analytical maturity of the industry. Transport, retail and banking are already highly analytical; there are firms in these industries already entering the era of Analytics 3.0. By contrast, I get frustrated with other industries that have big data but don't use it much, like telecom and entertainment.
What role will data scientists play in this evolution?
Davenport: Data scientists are a critical element of the shift to Analytics 3.0. They have the skills to extract and structure the complex, high-volume data sets that organizations use. However, they need to work closely with IT and traditional quantitative analysts to develop insights for the business. It is critical that there is close collaboration and communication between the business, IT and analytics teams.
How can companies prepare for Analytics 3.0?
Davenport: Companies can prepare for Analytics 3.0 in several ways. They need to start with discussions among senior management about how they play in the data economy and what resources they already have. They will also need to create a chief analytics officer (or equivalent role) to oversee the strategic deployment of analytics.
Next, they'll need to invest in the technology needed to manage big data and provide insights quickly.
Finally, companies will need to recruit, retain and effectively use analytical talent. They'll also need to ensure that analytics groups are aligned with the business and focus on critical business questions.
Three ways to do high-performance analytics
To ensure that you have the right combination of technologies to meet the demands of Analytics 3.0, SAS offers three distributed processing options:
SAS® In-Memory Analytics
Big data and intricate analytical computations are processed in memory and distributed across a dedicated set of blades to produce highly accurate insights that you can use to solve complex problems in near-real time.
SAS® In-Database Analytics
Data integration and analytic functions are executed inside the database, which enables better data governance and speeds time to insight since there's no need to move or convert data repeatedly.
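As a rough illustration of the in-database idea, here is a sketch using Python's built-in sqlite3 module as a stand-in for an enterprise database (not the SAS product itself): the aggregation runs inside the database engine, so only the small summary result is moved out, not the raw rows.

```python
import sqlite3

# In-memory SQLite database standing in for an enterprise data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 250.0), ("west", 400.0)],
)

# The GROUP BY aggregation executes inside the database engine; only the
# summary rows cross the wire, not the detail records.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 350.0), ('west', 400.0)]
conn.close()
```

At warehouse scale the same principle avoids repeatedly extracting terabytes of detail data just to compute a summary.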
SAS® Grid Computing
SAS jobs are processed in a shared, centrally managed pool of IT resources, which promotes efficiency, lower cost and better performance.
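The shared-pool idea can be sketched with Python's standard multiprocessing module (an analogy only, not SAS Grid itself): independent jobs are farmed out to a centrally managed pool of workers instead of running one after another.

```python
from multiprocessing import Pool

def analyze(partition: list) -> float:
    """A stand-in 'job': summarize one partition of the data."""
    return sum(partition) / len(partition)

if __name__ == "__main__":
    partitions = [[1, 2, 3], [10, 20, 30], [100, 200, 300]]
    # The pool plays the role of the centrally managed grid: jobs are
    # distributed across workers and the results collected as they finish.
    with Pool(processes=3) as pool:
        results = pool.map(analyze, partitions)
    print(results)  # [2.0, 20.0, 200.0]
```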