
Installing DeepSparse

DeepSparse is Neural Magic's inference engine, enabling you to run deep learning models on CPUs with exceptional performance and efficiency. This guide covers the available installation methods, including PyPI and, for advanced use cases, installation from source on GitHub.

Prerequisites

Hardware Requirements

  • CPU: x86 with AVX2 instructions (Intel Haswell or newer, AMD Zen 2 or newer) or ARM v8.2 or newer.
  • RAM: Minimum 1GB (model and configuration dependent)

Software Requirements

  • OS: Linux (Ubuntu, CentOS, Red Hat, etc.); macOS (beta).
  • Python: 3.8 - 3.11
  • ONNX: Version 1.5.0 - 1.15.0 (opset 11 or higher)

Community Installation

PyPI

For the most common use case, install the latest nightly build of DeepSparse from PyPI:

pip install deepsparse-nightly
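
To verify the installation, you can import the package and print its version. This is a minimal sketch; it only assumes the package imports cleanly and exposes a version string.

# Quick sanity check that DeepSparse installed correctly.
import deepsparse

# A successful import confirms the engine's native libraries loaded on this CPU;
# the version string shows which nightly build is installed.
print(deepsparse.__version__)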

GitHub (Advanced)

For development purposes or advanced use cases, install directly from the GitHub repository:

pip install git+https://github.com/neuralmagic/deepsparse.git@main

Or from a locally cloned repository:

git clone https://github.com/neuralmagic/deepsparse.git
cd deepsparse
git checkout main
pip install -e .[dev]

Product Usage Analytics

DeepSparse Community Edition gathers basic usage telemetry including, but not limited to, Invocations, Package, Version, and IP Address for Product Usage Analytics purposes. Review Neural Magic's Products Privacy Policy for further details on how we process this data.

To disable Product Usage Analytics, run the command:

export NM_DISABLE_ANALYTICS=True
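
If you prefer to set this from Python rather than the shell, the variable can be exported before DeepSparse is imported. This is a minimal sketch; it assumes the engine reads NM_DISABLE_ANALYTICS when it initializes, so the variable must be set before the first import.

import os

# Disable Product Usage Analytics before importing DeepSparse so the setting
# is already in place when the engine initializes.
os.environ["NM_DISABLE_ANALYTICS"] = "True"

import deepsparse  # imported after setting the variable on purpose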

Confirm that telemetry is disabled by checking the info logs streamed during engine invocation for the phrase "Skipping Neural Magic's latest package version check." For additional assistance, reach out through the DeepSparse GitHub Issue queue.

Enterprise Installation

PyPI

DeepSparse Enterprise provides enhanced features and production-grade support. Reach out to your dedicated Neural Magic representative to obtain a license key for production use.

pip install deepsparse-ent

Installing a License

Once you have obtained a license, you will need to initialize it to be able to run DeepSparse Enterprise. You can initialize your license by running:

deepsparse.license <license_string> or <path/to/license.txt>

To initialize a license on a machine:

  1. Confirm you have deepsparse-ent installed in a fresh virtual environment. Installing deepsparse and deepsparse-ent in the same virtual environment may result in unsupported behaviors.
  2. Run deepsparse.license with the <license_string> or path/to/license.txt as an argument:
  • deepsparse.license <samplelicensetring>
  • deepsparse.license ./license.txt
  3. If successful, deepsparse.license will write the license file to ~/.config/neuralmagic/license.txt. You may overwrite this path by setting the environment variable NM_CONFIG_DIR (before and after running the script) with the following command: export NM_CONFIG_DIR=path/to/license.txt
  4. Once the license is authenticated, you should see a splash message indicating that you are now running DeepSparse Enterprise.

If you encounter issues initializing your DeepSparse Enterprise License, contact Neural Magic through your dedicated support channel.

Validating a License

Once you have initialized your license, you may want to check that it is still valid before running a workload on DeepSparse Enterprise. To confirm your license is still active with DeepSparse Enterprise, run the command:

deepsparse.validate_license

deepsparse.validate_license can be run with no arguments, in which case it references the existing environment variable (if set), or with a single argument pointing to the license file, passed as path/to/license.txt.

To validate a license on a machine:

  1. If you have successfully run deepsparse.license, deepsparse.validate_license can be used to validate that the license file is in the correct location:

    • Run deepsparse.validate_license with no arguments. If the referenced license is valid, the DeepSparse Enterprise splash screen should display in your terminal window.
    • If the NM_CONFIG_DIR environment variable was set when creating the license, ensure this variable is still set to the same value.
  2. If you want to supply the path/to/license.txt:

    • Run deepsparse.validate_license with path/to/license.txt as an argument: deepsparse.validate_license --license_path path/to/license.txt
    • If the referenced license is valid, the DeepSparse Enterprise splash screen should display in your terminal window.

Specialized Installations

Install DeepSparse with the following extras to add tailored support for domain-specific use cases and tasks.

Generative AI: Hugging Face

For generative AI, particularly transformer architectures, this extra supports models such as Llama, Mistral, MPT, and GPT. It makes Hugging Face transformers pipelines and models compatible with DeepSparse, allowing performant and memory-efficient inference.

pip install deepsparse-nightly[llm]
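
After installing the extra, a text generation pipeline can be created along the lines of the sketch below. The SparseZoo stub, prompt, and max_new_tokens value are placeholders; substitute any supported LLM checkpoint.

from deepsparse import TextGeneration

# Create a text generation pipeline; the model argument is a placeholder for a
# SparseZoo stub, Hugging Face model path, or local ONNX export.
pipeline = TextGeneration(model="zoo:mpt-7b-dolly_mpt_pretrain-pruned50_quantized")

# Generate a completion for a prompt.
output = pipeline(prompt="Explain sparsity in one sentence.", max_new_tokens=64)
print(output.generations[0].text)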

Object Detection: YOLOv8

For object detection, this extra provides built-in support for YOLOv8 models. It makes YOLOv8 models and pipelines compatible with DeepSparse, allowing performant and memory-efficient inference.

pip install deepsparse-nightly[yolov8]
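
A detection pipeline might then be constructed as in the sketch below; the model path and sample image are placeholders.

from deepsparse import Pipeline

# Create a YOLOv8 object detection pipeline; the model path is a placeholder for
# a SparseZoo stub or a local ONNX export of a YOLOv8 model.
yolo_pipeline = Pipeline.create(task="yolov8", model_path="path/to/yolov8n.onnx")

# Run inference on an image file (the path is a placeholder).
results = yolo_pipeline(images=["sample.jpg"])
print(results)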

Image Classification: TorchVision

For image classification, this extra provides built-in support for TorchVision models. It makes TorchVision models and pipelines compatible with DeepSparse, allowing performant and memory-efficient inference.

pip install deepsparse-nightly[image_classification]
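
A classification pipeline might look like the sketch below; the SparseZoo stub and image path are placeholders.

from deepsparse import Pipeline

# Create an image classification pipeline; the stub is a placeholder for any
# supported TorchVision-style classification model.
clf_pipeline = Pipeline.create(
    task="image_classification",
    model_path="zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none",
)

# Classify an image (the path is a placeholder).
prediction = clf_pipeline(images=["sample.jpg"])
print(prediction.labels, prediction.scores)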

Natural Language Processing: Hugging Face

This extra provides built-in support for Hugging Face's transformers models for natural language processing. It makes Hugging Face transformers pipelines and models compatible with DeepSparse, allowing performant and memory-efficient inference.

pip install deepsparse-nightly[transformers]
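
A sentiment analysis pipeline, for example, can be created as in the sketch below; the SparseZoo stub is a placeholder for any supported transformers model.

from deepsparse import Pipeline

# Create a sentiment analysis pipeline; the model stub is a placeholder.
sa_pipeline = Pipeline.create(
    task="sentiment-analysis",
    model_path="zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none",
)

# Run inference on a text sequence.
prediction = sa_pipeline("DeepSparse runs fast on CPUs.")
print(prediction)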

DeepSparse Server

For HTTP serving of deep learning models, this extra provides built-in support for DeepSparse Server. It enables serving any DeepSparse pipelines via HTTP and can be combined with other extras for additional model support.

pip install deepsparse-nightly[server]
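
Once a server is running, it can be queried over HTTP from any client. The sketch below uses Python's requests library; the launch command shown in the comment, the port (5543), and the /predict route reflect the assumed default single-endpoint setup and may differ by DeepSparse version and configuration.

import requests

# Query a running DeepSparse Server. The server is assumed to have been started
# separately, e.g. with something like:
#   deepsparse.server --task sentiment-analysis --model_path <model stub>
# The port and route below are assumed defaults and may vary by version.
response = requests.post(
    "http://localhost:5543/predict",
    json={"sequences": ["DeepSparse Server is easy to query over HTTP."]},
)
print(response.json())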

ONNX Runtime

This extra provides built-in support for ONNX Runtime for benchmarking and comparison purposes. It enables running any DeepSparse pipelines via ONNX Runtime and can be combined with other extras for additional model support.

pip install deepsparse-nightly[onnxruntime]
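
For an apples-to-apples comparison, the same pipeline can be pointed at the ONNX Runtime backend, as in the sketch below. The engine_type argument and the SparseZoo stub are assumptions; adjust them to your installed version and model.

from deepsparse import Pipeline

# Build the same sentiment analysis pipeline but execute it with ONNX Runtime
# instead of the DeepSparse engine, which is useful for benchmarking comparisons.
ort_pipeline = Pipeline.create(
    task="sentiment-analysis",
    model_path="zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none",
    engine_type="onnxruntime",
)

print(ort_pipeline("Comparing DeepSparse with ONNX Runtime."))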

Development

For contributing to or developing DeepSparse itself, this extra installs the project's development tooling.

pip install deepsparse-nightly[dev]