SAS® Open Model Manager Features

Model registration

  • Provides secure, reliable, versioned storage for all types of models.
  • Once imported, models can be searched, queried, sorted, and filtered by their attributes – asset type, algorithm, input and output variables, target variables, model ID, and so on – as well as by user-defined properties and editable keywords.
  • Add general properties as columns to the model and project listings – for example, model name, role, algorithm type, date modified, modified by, repository location, description, version, and keywords (tags).
  • Access models and model-score artifacts using open REST APIs.
  • Directly supports Python models for scoring and publishing. R code can be managed and versioned like other types of code.
  • Provides accounting and auditability, including event logging of major actions – e.g., model creation, project creation and publishing.
  • Export models in ZIP format, including all model file contents, for movement across environments.
  • Easily copy models from one project to another, simplifying model movement within the repository. 
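Model access over the REST APIs mentioned above can be sketched in Python. The host, route, and filter syntax below are illustrative assumptions – substitute your deployment's base URL and the documented Model Repository API routes, and add authentication as required:

```python
import urllib.request

# Hypothetical base URL -- replace with your deployment's actual host.
BASE_URL = "http://modelmanager.example.com"

def build_model_query(name_filter):
    """Build (but do not send) a GET request that filters models by name.

    The route and filter expression are assumptions for illustration;
    consult the Model Repository API documentation for the real syntax.
    """
    url = f"{BASE_URL}/modelRepository/models?filter=contains(name,'{name_filter}')"
    return urllib.request.Request(
        url,
        headers={"Accept": "application/json"},
        method="GET",
    )

req = build_model_query("churn")
print(req.get_full_url())
```

Constructing the request separately from sending it keeps the example runnable offline; in practice you would pass `req` to `urllib.request.urlopen` (or use an HTTP client of your choice) and parse the JSON response.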

Model scoring

  • Python and other open-source models can be placed in the same project so users can compare and assess them using different model fit statistics.
  • Set up, maintain and manage separate versions for models:
    • A new version is created automatically when a model is set as champion, updated, or published in a project.
    • Designate challenger models to compete against the project champion model.
  • Define test and production score jobs for Python and other open-source models, specifying the required inputs and outputs.
  • Create and execute scoring tasks, and specify where to save the output and job history.
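A test score job like the ones described above can be sketched in plain Python. The model, variable names, and job-history fields here are illustrative assumptions, not Open Model Manager's actual score-code contract – a real job would load the registered model artifact rather than a hand-coded rule:

```python
from datetime import datetime, timezone

INPUT_VARS = ["balance", "tenure"]   # assumed required inputs
OUTPUT_VARS = ["score"]              # assumed required output

def score_row(row):
    # Toy scoring rule standing in for the real model's predict call.
    return 1.0 if float(row["balance"]) > 1000 and int(row["tenure"]) < 12 else 0.0

def run_score_job(input_rows):
    """Score each input row and return (results, job_history_entry)."""
    results = [
        {**{v: row[v] for v in INPUT_VARS}, "score": score_row(row)}
        for row in input_rows
    ]
    history = {
        "rows_scored": len(results),
        "finished_at": datetime.now(timezone.utc).isoformat(),
    }
    return results, history

rows = [{"balance": "2500", "tenure": "6"}, {"balance": "300", "tenure": "24"}]
results, history = run_score_job(rows)
```

The returned `history` dictionary stands in for the job history that a scoring task records; the `results` list would be written to the configured output location.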

Model deployment

  • Depending on the use case, you can publish models to batch or operational systems – e.g., in-database or in-streaming environments.
  • Publish Python and other open-source models to runtime containers with embedded binaries and score code files. Promote runtime containers to local Docker and AWS Docker environments.
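Publishing to a runtime container bundles the score code and model binaries into an image. A minimal sketch of what such an image might look like, assuming a hypothetical `score.py` entry point and `model.pickle` artifact (the layout of the image that Open Model Manager actually generates may differ):

```dockerfile
# Hypothetical layout for a Python model runtime container.
FROM python:3.8-slim

# Embedded score code and model artifacts produced at publish time.
COPY score.py model.pickle requirements.txt /app/
RUN pip install -r /app/requirements.txt

# Scoring entry point: reads input data, writes scored output.
WORKDIR /app
ENTRYPOINT ["python", "score.py"]
```

An image built this way can be run in a local Docker environment or pushed to a registry such as Amazon ECR for AWS deployment.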

Model monitoring

  • Monitor the performance of models with any type of score code over time. Produce performance charts, including:
    • Variable distribution plots.
    • Lift charts.
    • Stability charts.
    • ROC charts.
    • K-S charts.
    • Gini index.
  • Schedule recurring and future jobs for performance monitoring.
  • Specify multiple data sources and time-collection periods when defining performance-monitoring tasks.
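The K-S and Gini statistics listed above can be computed directly from scored data. A self-contained sketch, independent of any Open Model Manager API, using a tiny illustrative data set:

```python
def ks_and_gini(scores, labels):
    """Compute the K-S statistic and Gini index for binary predictions.

    scores: model scores (higher = more likely to be the event)
    labels: 1 for the event, 0 for the non-event
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]

    # K-S: maximum gap between the two empirical CDFs at observed cut-offs.
    ks = 0.0
    for cut in sorted(set(scores)):
        cdf_pos = sum(s <= cut for s in pos) / len(pos)
        cdf_neg = sum(s <= cut for s in neg) / len(neg)
        ks = max(ks, abs(cdf_pos - cdf_neg))

    # AUC via pairwise comparison (ties count half); Gini = 2*AUC - 1.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return ks, 2 * auc - 1

# Perfectly separated toy example: K-S = 1.0, Gini = 1.0.
print(ks_and_gini([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))
```

In a monitoring job, these statistics would be recomputed for each time-collection period and plotted to track degradation in the champion model's discrimination.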


Packaging and delivery

  • Packaged and delivered as one standalone container, with all tools included.
  • The container is ready for on-site deployment.