When it works – and when it doesn't
There is a lot of buzz around the idea of automating analytics. Companies, weary of investing huge amounts of time and money into a project that yields a one-time boost in sales or profits, are lured by the idea of automated analysis that solves problems without the need for teams of specialized experts or high-priced consultants.
The reality is that there is no magic button. Analytics practitioners can automate many analytical processes, allowing your experts – whether they investigate claims or build marketing lists – to work more efficiently. Accounting software didn't replace accountants, and automating analytic functions doesn't replace modelers and analysts. The initial processes still need to be built and automated, and there will be ongoing work to maintain and modify them as business needs, data structures or other factors change.
What automation can do is power huge efficiency gains and allow a company to cost-effectively explore and test models to find the right customers for a specific offer or the optimal way to flag suspect claims. The combination of a well-designed data warehouse like those provided by Teradata and high-powered analytics like those provided by SAS helps automate scoring, validation and tuning, leaving users more time to create and explore. It allows companies to work with large volumes of data quickly and efficiently. Having an analytical development environment, or sandbox, within your Teradata data warehouse, or in an attached Teradata analytic appliance, is key to enabling the analysts to develop, test, automate and deploy their processes.
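The scoring step described above can be sketched in a few lines. This is a minimal illustration, not SAS or Teradata code: the model here is a hypothetical pre-trained logistic scorer, and the names (`score_customer`, `WEIGHTS`, the feature fields) are assumptions made for the example.

```python
import math

# Hypothetical coefficients from a previously trained propensity model.
WEIGHTS = {"recency_days": -0.02, "purchases_90d": 0.35, "bias": -1.0}

def score_customer(features):
    """Return a 0-1 propensity score via a logistic function."""
    z = WEIGHTS["bias"]
    z += WEIGHTS["recency_days"] * features["recency_days"]
    z += WEIGHTS["purchases_90d"] * features["purchases_90d"]
    return 1.0 / (1.0 + math.exp(-z))

def batch_score(customers, threshold=0.5):
    """Score every customer and flag those above the cutoff."""
    return [(c["id"], score_customer(c), score_customer(c) >= threshold)
            for c in customers]

customers = [
    {"id": 1, "recency_days": 5,  "purchases_90d": 8},
    {"id": 2, "recency_days": 90, "purchases_90d": 0},
]
for cid, score, flagged in batch_score(customers):
    print(cid, round(score, 3), flagged)
```

Once a loop like this is automated, rescoring the full customer base on fresh warehouse data becomes a scheduled job rather than a manual project.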
Automation helps companies expand into new markets
When the process is automated, it is much easier to use multiple models, test them and tweak them. If it takes weeks or months to build a model, the market changes before a company can test several – or it must engage many more modelers in the process. Once the right model is selected, it can be deployed repeatedly for the specific business issue (selecting potential shoppers for teen offers) and tweaked along the way to account for customers aging out of the market or for new stores opening in a geographic area.
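Testing several models and selecting the best one is often called a champion/challenger comparison. The sketch below assumes candidate models are plain scoring functions compared on a small labelled holdout set; the model names and data are illustrative, not from the article.

```python
def accuracy(model, holdout):
    """Fraction of holdout records the model classifies correctly."""
    hits = sum(1 for features, label in holdout if model(features) == label)
    return hits / len(holdout)

# Two hypothetical candidate models for a "teen offer" list.
def model_age_only(f):
    return 1 if f["age"] <= 19 else 0

def model_age_and_visits(f):
    return 1 if f["age"] <= 19 and f["store_visits"] >= 2 else 0

# Labelled holdout records: (features, responded-to-offer label).
holdout = [
    ({"age": 16, "store_visits": 3}, 1),
    ({"age": 18, "store_visits": 0}, 0),
    ({"age": 34, "store_visits": 5}, 0),
    ({"age": 17, "store_visits": 2}, 1),
]

candidates = {"age_only": model_age_only,
              "age_and_visits": model_age_and_visits}
champion = max(candidates, key=lambda name: accuracy(candidates[name], holdout))
print(champion)  # → age_and_visits
```

With the plumbing automated, swapping a challenger in for the champion is a configuration change rather than a rebuild.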
Four ways analytics will change in 10 years
Staying ahead of the curve isn't easy, but it's important. Four of the issues for the next decade are:
The importance of speed and accuracy
Because of this, the company often took a shortcut – chunking up its list so that it built a model in SAS against a list of 350,000 customers instead of the entire database.
When it created a data store in Teradata with 1,400 variables and went with a high-powered analytic approach, data preparation took just 90 minutes. As is common, simply standardizing and automating the creation of key metrics of interest can drive huge dividends by itself. If an analyst can go right to doing analysis with up-to-date data instead of spending a lot of time simply getting the data together, productivity will greatly increase. In the case of the financial firm, models were built in two weeks (versus 14) and run time dropped to 36 minutes. Most importantly, the new approach brought in about $1 million a month in payments from customers at risk for default.
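The "standardizing and automating the creation of key metrics" step can be sketched as a roll-up from raw transactions to per-customer analytic variables. This is a toy illustration in Python – in practice a warehouse would compute these in SQL – and the metric names (`spend_90d`, `txn_count`, `recency_days`) are assumptions for the example.

```python
from collections import defaultdict

def build_metrics(transactions):
    """Roll raw (customer_id, amount, days_ago) rows up into
    standardized per-customer variables."""
    metrics = defaultdict(lambda: {"spend_90d": 0.0,
                                   "txn_count": 0,
                                   "recency_days": None})
    for cust, amount, days_ago in transactions:
        m = metrics[cust]
        m["txn_count"] += 1
        if days_ago <= 90:                      # spend in the last quarter
            m["spend_90d"] += amount
        if m["recency_days"] is None or days_ago < m["recency_days"]:
            m["recency_days"] = days_ago        # most recent transaction
    return dict(metrics)

txns = [(1, 120.0, 10), (1, 45.0, 200), (2, 15.0, 5)]
print(build_metrics(txns))
```

Because the definitions are computed in one place, every analyst starts from the same up-to-date variables instead of rebuilding them by hand.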
How not to automate
It is also important, when you implement an automated process, to have specific dates planned to revisit it. Together, SAS and Teradata provide all the tools needed to deploy, manage and update your models. Even the best process will weaken over time as the market changes, business evolves and customers mature. A process should be expected to be outdated and in need of updating or retiring within a few quarters or at most a few years. Just as you can't buy a house and never maintain or renovate it, you can't build an analytic process and then never revisit it.
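The scheduled-revisit idea above can be expressed as a simple model-health check: track validation performance over time and flag the model for review once it decays past a tolerance from its launch benchmark. The thresholds and accuracy figures below are illustrative assumptions, not SAS or Teradata defaults.

```python
def needs_review(launch_accuracy, recent_accuracy, tolerance=0.05):
    """Flag the model when performance has decayed past the tolerance."""
    return (launch_accuracy - recent_accuracy) > tolerance

# Hypothetical quarterly validation accuracy for a deployed model.
history = [0.81, 0.80, 0.78, 0.74]
launch = history[0]
flags = [needs_review(launch, acc) for acc in history]
print(flags)  # the fourth quarter's drop triggers a review
```

A check like this, run at each planned revisit date, turns "the model will go stale eventually" into a concrete trigger for retuning or retiring it.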
This story appears in the Fourth Quarter 2011 issue of