On-Demand Webinar

Model Interpretability: What’s Inside A Black Box?

About the webinar

Have you ever asked yourself, "How can I explain a complicated machine learning (ML) model?" You are not alone! The good news is that you are asking the right question.

As machine learning models grow more sophisticated, they become harder to interpret. When models carry economic and social impact, the ability to explain and understand them is essential.

Join us to learn how SAS has incorporated top-notch techniques into its toolbox to make interpreting models easy and accessible. These techniques include: Partial Dependence (PD) plots, Individual Conditional Expectation (ICE) plots, Local Interpretable Model-Agnostic Explanations (LIME), and Kernel Shapley Additive Explanations (Kernel SHAP).

In this webinar, we'll answer questions that include:

  • What is model interpretability and why is it important?
  • What techniques are available to better explain an ML model?
  • What are best practices when dealing with model interpretability?



About the Experts


Yen Nguyen
Solution Specialist - Data Sciences
SAS Canada

With broad business exposure and experience in retail analytics, specializing in marketing and demand planning, Yen helps businesses identify critical issues and leverage the power of analytics effectively to achieve their desired business value.


Marie Coolsaet
Solution Specialist - Data Sciences
SAS Canada

Leveraging a background in Electrical Engineering, 5 years working closely with SAS technology, and a passion for finding creative approaches to analytics use cases, Marie works to demonstrate the value and impact of data-driven decisions.