Neural Network Compression Framework (NNCF)

Neural Network Compression Framework (NNCF) provides a suite of post-training and training-time algorithms for optimizing inference of neural networks in OpenVINO™ with a minimal accuracy drop.

NNCF is designed to work with models from PyTorch, TensorFlow, ONNX and OpenVINO™.

The framework is organized as a Python package that can be built and used as a standalone tool. Its architecture is unified to make adding different compression algorithms easy for both PyTorch and TensorFlow.

NNCF provides samples that demonstrate the usage of compression algorithms for different use cases and models. See compression results achievable with the NNCF-powered samples on the NNCF Model Zoo page.

For more information about NNCF, see the documentation and usage examples in the repository.

Key Features

Post-Training Compression Algorithms

| Compression algorithm      | OpenVINO      | PyTorch      | TensorFlow    | ONNX          |
|----------------------------|---------------|--------------|---------------|---------------|
| Post-Training Quantization | Supported     | Supported    | Supported     | Supported     |
| Weight Compression         | Supported     | Supported    | Not supported | Not supported |
| Activation Sparsity        | Not supported | Experimental | Not supported | Not supported |

Training-Time Compression Algorithms

| Compression algorithm        | PyTorch      | TensorFlow    |
|------------------------------|--------------|---------------|
| Quantization Aware Training  | Supported    | Supported     |
| Mixed-Precision Quantization | Supported    | Not supported |
| Sparsity                     | Supported    | Supported     |
| Filter pruning               | Supported    | Supported     |
| Movement pruning             | Experimental | Not supported |

  • Automatic, configurable model graph transformation to obtain the compressed model.

    NOTE: Limited support for TensorFlow models. Only models created using Sequential or Keras Functional API are supported.

  • Common interface for compression methods.
  • GPU-accelerated layers for faster compressed model fine-tuning.
  • Distributed training support.
  • Git patches for a prominent third-party repository (huggingface-transformers) demonstrating the process of integrating NNCF into custom training pipelines.
  • Seamless combination of pruning, sparsity, and quantization algorithms. Refer to optimum-intel for examples of joint (movement) pruning, quantization, and distillation (JPQD), end-to-end from NNCF optimization to compressed OpenVINO IR.
  • Exporting PyTorch compressed models to ONNX checkpoints and TensorFlow compressed models to SavedModel or Frozen Graph format, ready to use with the OpenVINO™ toolkit.
  • Support for Accuracy-Aware model training pipelines via the Adaptive Compression Level Training and Early Exit Training methods.
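
Training-time algorithms are driven by an NNCF configuration file. The fragment below is a hypothetical example combining quantization-aware training with magnitude sparsity; the `sample_size` and sparsity settings are placeholders to be adapted to a real model.

```json
{
    "input_info": {"sample_size": [1, 3, 224, 224]},
    "compression": [
        {"algorithm": "quantization"},
        {
            "algorithm": "magnitude_sparsity",
            "sparsity_init": 0.1,
            "params": {"schedule": "multistep"}
        }
    ]
}
```

A configuration like this is loaded via `NNCFConfig.from_json` and applied with `nncf.torch.create_compressed_model(model, nncf_config)`, after which the wrapped model is fine-tuned in the usual training loop.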

Installation Guide

NNCF can be installed as a regular PyPI package:

pip install nncf

For detailed installation instructions, refer to the Installation guide.

System Requirements

  • Ubuntu 18.04 or later (64-bit)
  • Python 3.8 or later
  • Supported frameworks:
    • PyTorch >=2.2, <2.5
    • TensorFlow >=2.8.4, <=2.15.1
    • ONNX ==1.16.0
    • OpenVINO >=2022.3.0

Third-party Repository Integration

NNCF may be easily integrated into training/evaluation pipelines of third-party repositories.

  • OpenVINO Training Extensions

    NNCF is integrated into OpenVINO Training Extensions as a model optimization backend. You can train, optimize, and export new models based on available model templates as well as run the exported models with OpenVINO.

  • HuggingFace Optimum Intel

    NNCF is used as a compression backend within the renowned transformers repository in HuggingFace Optimum Intel.

NNCF Compressed Model Zoo

A list of models and compression results for them can be found at our NNCF Model Zoo page.

Citing

@article{kozlov2020neural,
    title =   {Neural network compression framework for fast model inference},
    author =  {Kozlov, Alexander and Lazarevich, Ivan and Shamporov, Vasily and Lyalyushkin, Nikolay and Gorbachev, Yury},
    journal = {arXiv preprint arXiv:2002.08679},
    year =    {2020}
}

Telemetry

NNCF as part of the OpenVINO™ toolkit collects anonymous usage data for the purpose of improving OpenVINO™ tools. You can opt-out at any time by running the following command in the Python environment where you have NNCF installed:

opt_in_out --opt_out

More information is available in the OpenVINO telemetry documentation.