SAS Model Manager Features List

Model registration

  • Provides a web page that includes information on any new features and functionality released each month, as well as "how to" content and the ability to interact with the user community.
  • Provides secure, reliable, versioned storage for all types of models, as well as access administration, including backup and restore capabilities, overwrite protection and event logging.
  • Once registered, models can be searched, queried, sorted and filtered by the attributes used to store them (type of asset, algorithm, input or target variables, model ID, etc.), as well as by user-defined properties and editable keywords.
  • Add general properties as columns to the listing for models and projects, such as model name, role, type of algorithm, date modified, modified by, repository location, description, version and keywords (tags).
  • Access models and model-score artifacts using open REST APIs (see the sketch after this list).
  • Directly supports Python models for scoring and publishing. Convert PMML and ONNX models (using DLPy) to standard SAS model types. Manage and version R code like other types of code.
  • Provides accounting and auditability, including event logging of major actions (e.g., model creation, project creation and publishing).
  • Export models in .ZIP format, including all model file contents, for movement across environments.
  • Easily copy models from one project to another, simplifying model movement within the repository.
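
A minimal sketch of what access through the open REST APIs can look like, using Python's requests library. The host name, access token and filter value are placeholders, and the modelRepository endpoint path and filter syntax are assumptions based on SAS Viya REST API conventions rather than values from this document. Open-source clients such as sasctl wrap these same endpoints for Python users.

    # List registered models and filter them on a stored attribute,
    # mirroring the search/filter capability described above.
    import requests

    HOST = "https://viya.example.com"        # placeholder Viya host
    TOKEN = "REPLACE_WITH_ACCESS_TOKEN"      # placeholder OAuth token from SASLogon

    resp = requests.get(
        f"{HOST}/modelRepository/models",
        params={"filter": "eq(algorithm,'Gradient boosting')"},  # assumed filter syntax
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()

    for item in resp.json().get("items", []):
        print(item["name"], item.get("modifiedTimeStamp"))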

Analytical workflow management

  • Create custom processes for each model using SAS Workflow Studio:
    • The workflow manager is fully integrated with SAS Model Manager so you can manage workflows and track workflow tasks within the same user interface.
    • Import, update and export generic models at the folder level, and duplicate or move them to another folder.
  • Facilitates collaboration across teams with automated notifications.
  • Perform common model management tasks, such as importing, viewing and attaching supporting documentation; setting a project champion model and flagging challenger models; publishing models for scoring purposes; and viewing dashboard reports.
  • Provides transparency into your analytical process with a centralized model repository, life cycle templates and version control, ensuring complete traceability and analytics governance.

Model scoring

  • Place a combination of Python, SAS or other open source models in the same project for users to compare and assess using different model fit statistics.
  • Set up, maintain and manage separate versions for models:
    • The champion model is automatically defined as a new version when the model is set as champion, updated or published in a project.
    • Choose challenger models to compare against the project champion model.
    • Monitor and publish challenger and champion models.
  • Define test and production score jobs for SAS and Python models using required inputs and outputs.
  • Create and execute scoring tasks, and specify where to save the output and job history.
  • Compare models side by side to quickly evaluate and select the champion model from all competing models (SAS and open source) for a specific business problem (see the sketch after this list).
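
An offline illustration of the kind of fit statistics used in such side-by-side comparisons (ROC AUC, Gini, K-S and top-decile lift), computed with numpy and scikit-learn. The labels and predicted probabilities are synthetic placeholders; in SAS Model Manager these statistics come from the built-in comparison reports rather than from code like this.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, size=5_000)                       # synthetic labels
    scores = {                                                    # synthetic model outputs
        "champion":   np.clip(y_true * 0.25 + rng.random(5_000), 0, 1),
        "challenger": np.clip(y_true * 0.15 + rng.random(5_000), 0, 1),
    }

    def ks_statistic(y, p):
        """Kolmogorov-Smirnov distance between event and non-event score distributions."""
        order = np.argsort(p)
        y_sorted = y[order]
        cum_event = np.cumsum(y_sorted) / y_sorted.sum()
        cum_nonevent = np.cumsum(1 - y_sorted) / (1 - y_sorted).sum()
        return float(np.abs(cum_event - cum_nonevent).max())

    def top_decile_lift(y, p):
        """Event rate in the top 10% of scores relative to the overall event rate."""
        cutoff = np.quantile(p, 0.9)
        return float(y[p >= cutoff].mean() / y.mean())

    for name, p in scores.items():
        auc = roc_auc_score(y_true, p)
        print(f"{name:10s}  AUC={auc:.3f}  Gini={2 * auc - 1:.3f}  "
              f"KS={ks_statistic(y_true, p):.3f}  lift@10%={top_decile_lift(y_true, p):.2f}")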

Model deployment

  • Depending on the use case, you can publish models to batch/operational systems (e.g., SAS server, in-database, in-Hadoop/Spark, SAS Cloud Analytic Services (CAS) server) or to on-demand systems using the Micro Analytic Score (MAS) service (see the sketch after this list).
  • Publish Python and SAS models to runtime containers with embedded binaries and score code files. Promote runtime containers to local Docker, AWS Docker and Amazon EKS (Elastic Kubernetes Service) environments.
  • Provides an Azure container publishing destination for open-source models.
  • Publish SAS and open-source models to Azure Machine Learning as an Azure container.
  • Publish SAS models using the SAS runtime container.
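
A minimal sketch of publishing a registered model to one of these destinations from Python, assuming the open-source sasctl package and a configured publishing destination named "maslocal" (a Micro Analytic Score destination); the host, credentials and model name are placeholders. Container destinations (Docker, Amazon EKS, Azure) would be selected the same way by naming a different configured destination.

    from sasctl import Session
    from sasctl.services import model_repository as mr
    from sasctl.tasks import publish_model

    # Placeholder connection details for a SAS Viya environment.
    with Session("https://viya.example.com", "modeler", "password"):
        model = mr.get_model("gradient_boost")      # hypothetical registered model
        # Publish to the MAS destination so the model can be scored on demand.
        module = publish_model(model, "maslocal")
        print(module.name)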

Model monitoring

  • Monitor the performance of models with any type of score code. Performance reports for champion and challenger R, Python and SAS models include variable distribution plots, lift charts, stability charts, and ROC, K-S and Gini reports, rendered in SAS Visual Analytics from performance-reporting output result sets.
  • Built-in reports display measures for input and output data, along with fit statistics for classification and regression models, to help you evaluate whether to retrain, retire or create new models. Performance reports with different accuracy statistics are available for champion and challenger models built in Python, SAS, R and other languages.
  • Monitor the performance of champion models for all projects using performance report definitions and execution.
  • Schedule recurring and future jobs for performance monitoring.
  • Specify multiple data sources and time-collection periods when defining performance-monitoring tasks.
  • Generate custom performance reports, and create and monitor custom business KPIs with access to model performance data (see the sketch after this list).
  • Set up out-of-the-box and custom KPI performance monitoring using a convenient wizard, and get simple alert notifications.
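
A small sketch of one custom KPI that can be computed from model performance data: the population stability index (PSI), a common way to quantify the drift that stability charts visualize. The score arrays below are synthetic placeholders, and the rule of thumb that PSI above roughly 0.2 signals significant drift is a general convention, not a SAS Model Manager setting.

    import numpy as np

    def population_stability_index(baseline, current, bins=10):
        """PSI between a baseline score distribution and a more recent one."""
        edges = np.quantile(baseline, np.linspace(0, 1, bins + 1))
        b = np.histogram(baseline, bins=edges)[0] / len(baseline)
        c = np.histogram(np.clip(current, edges[0], edges[-1]), bins=edges)[0] / len(current)
        b = np.clip(b, 1e-6, None)                  # avoid log(0)
        c = np.clip(c, 1e-6, None)
        return float(np.sum((c - b) * np.log(c / b)))

    rng = np.random.default_rng(1)
    baseline_scores = rng.beta(2, 5, size=10_000)       # scores at deployment time
    current_scores = rng.beta(2.5, 4.5, size=10_000)    # more recent, shifted scores
    print(f"PSI = {population_stability_index(baseline_scores, current_scores):.3f}")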