10 design elements to consider before building an analytical model

Building a great house takes careful, thoughtful planning – starting with the foundation. Even if you’re lured by the promise of ocean views and a great tan, you should still heed the old saying and never build a house on sand. You need to be mindful of the future and create a flawless construction plan to ensure it can be enjoyed for generations to come.

The same advice applies to building an analytical model. It’s easy to get caught up in the grand design of what you might achieve, but enthusiasm won’t replace a rock-solid foundation. While it may be tempting to just start building your models and hope for the best, exercise care now or your project may collapse later on.

Here are some questions to get you started on a good design path:

  1. Who needs to access the results? Will the results be provided to a case manager or investigator?
  2. What do users need to see? Do I need to push out a score or a list to them? Do they need to know why? Do they need to see all of the data or only those cases that are relevant?
  3. Do I need to be able to explain and defend what the model is doing, or is a score enough?
  4. What systems do I need to integrate this solution with? Are users working in a reporting or case management system, or an inventory planning system? Can I extend those systems, or do I need to add case management capabilities?
  5. Do users need to view results in real time, or is batch processing the way to go?
  6. Will I need a hybrid approach, or will a single model suffice?
  7. Are users going to score multiple things at once (like a recommendation engine), or are they predicting one outcome at a time?
  8. What data will be available at the time users want to score?
  9. What is acceptable model performance?
  10. How will I measure success? (A quick sketch of writing these targets down as code follows this list.)
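
For questions 9 and 10, it often helps to agree on the targets before any modeling starts. Below is a minimal Python sketch of codifying acceptance criteria; the metric names and threshold values are illustrative placeholders, not recommendations for any particular model or industry.

```python
# Minimal sketch: writing down "acceptable performance" and "success" up front.
# All thresholds below are placeholders chosen for illustration only.

from dataclasses import dataclass


@dataclass
class AcceptanceCriteria:
    min_auc: float                  # minimum area under the ROC curve
    max_false_positive_rate: float  # how many false alarms users will tolerate
    max_scoring_latency_ms: float   # matters if users need real-time results


CRITERIA = AcceptanceCriteria(
    min_auc=0.80,
    max_false_positive_rate=0.05,
    max_scoring_latency_ms=200.0,
)


def meets_criteria(auc: float, fpr: float, latency_ms: float) -> bool:
    """Return True only if a candidate model clears every agreed threshold."""
    return (
        auc >= CRITERIA.min_auc
        and fpr <= CRITERIA.max_false_positive_rate
        and latency_ms <= CRITERIA.max_scoring_latency_ms
    )


# Example: evaluating a candidate model on a holdout set
print(meets_criteria(auc=0.84, fpr=0.03, latency_ms=150.0))  # True
```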

Remember that all analytical models go out of date and must be updated. Unfortunately, many organizations fail to update their models because it is too difficult: not the model-building part, but the redeployment part. So make sure you plan for this ahead of time. Some models, especially in online retail, need to be changed daily; others may not even require an annual refresh.
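
To make that planning concrete, here is a small sketch of checking whether a model is past its planned refresh date. The use cases and intervals are purely illustrative assumptions; your own cadence should come from answering the questions above, not from these placeholder values.

```python
# Minimal sketch of planning refresh cadence up front (illustrative values only).

from datetime import date, timedelta

# How stale a model is allowed to get before it must be redeployed.
MAX_MODEL_AGE = {
    "online_retail_recommendations": timedelta(days=1),
    "credit_risk": timedelta(days=90),
    "churn": timedelta(days=365),
}


def refresh_due(use_case: str, last_deployed: date, today: date) -> bool:
    """Return True if the model for this use case is past its planned refresh date."""
    return today - last_deployed > MAX_MODEL_AGE[use_case]


# Example: a churn model deployed more than a year ago is overdue
print(refresh_due("churn", date(2023, 1, 1), date(2024, 2, 5)))  # True
```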

For example, a top US credit company used to manually recode all of its models to run in its operational environment. Once a model was built, it took six months to implement. Fraud prevention models are often used for months or even years before they are updated, far past their viability. As soon as fraudsters realize a model has caught on to them, they find a different route and keep committing fraud, leaving stale analytic models in the dust. Had the company considered deployment from the beginning, it could have saved itself considerable effort.

Resist the urge to start working on your project before fully considering these questions. If you’re still unconvinced, consider this cautionary tale. An analyst developed a very complex model in an open-source tool and was trying to implement it. The only way he could do this was to manually recode his work. Manually! Had he started the process with the end in mind, he would have saved his agency time and money.

This should almost go without saying, but hard-coded deployments should be avoided. To have long-term success with an analytic solution, an organization should select technologies that are preconfigured to work together, that monitor performance over time, and that alert you when changes should be made.
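
To illustrate the monitor-and-alert idea, here is a minimal sketch that assumes you log a performance metric (here, AUC on recently labeled cases) after each scoring period. The baseline, the tolerance, and the alert mechanism (a log message) are all assumptions for illustration; a real deployment would route alerts into your case management or operations tooling.

```python
# Minimal sketch of monitoring model performance over time and alerting
# when a refresh is due. Thresholds are illustrative placeholders.

import logging
from datetime import date

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("model_monitor")

BASELINE_AUC = 0.84            # performance at deployment time (illustrative)
DEGRADATION_TOLERANCE = 0.05   # how far performance may drift before retraining


def check_model_health(current_auc: float, as_of: date) -> bool:
    """Return True if the model still performs acceptably; alert otherwise."""
    if current_auc < BASELINE_AUC - DEGRADATION_TOLERANCE:
        log.warning(
            "%s: AUC dropped to %.3f (baseline %.3f). Schedule a model refresh.",
            as_of, current_auc, BASELINE_AUC,
        )
        return False
    log.info("%s: AUC %.3f is within tolerance.", as_of, current_auc)
    return True


# Example: periodic health checks as recent performance erodes
check_model_health(0.83, date(2024, 1, 7))   # within tolerance
check_model_health(0.76, date(2024, 2, 4))   # triggers a refresh alert
```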