OpenVINO™ Execution Provider for ONNXRuntime 5.9

@vthaniel vthaniel released this 25 Feb 12:20
· 1 commit to rel-1.24.1 since this release
06bc1ae

Description:
OpenVINO™ Execution Provider for ONNXRuntime v5.9 release, based on the OpenVINO™ 2025.4.1 release and the ONNXRuntime 1.24.1 release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

This release supports ONNXRuntime 1.24.1 with the OpenVINO™ 2025.4.1 release.
Please refer to the OpenVINO™ Execution Provider for ONNXRuntime build instructions for system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Platform Support Changes:

  • Python 3.10 wheels are no longer published.

Modifications:

  • Supports OpenVINO 2025.4.1.
  • Enables a session option to stop context sharing.
  • Implements single-binary-file support for OpenVINO EP context serialization and deserialization.
  • Optimizes stateful path handling.
  • Adds a node to update the KV cache in stateful LLM models.
  • Removes the OV SDK version check for the EPContext consumption flow on NPU.
  • Bug fixes.

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Package:
https://pypi.org/project/onnxruntime-openvino/

Installation and Usage Instructions on Windows:

pip install onnxruntime-openvino

# If using the Python openvino package to set up the OpenVINO runtime environment:
pip install openvino==2025.4.1

# Then add these two lines to the application code:
import onnxruntime.tools.add_openvino_win_libs as utils
utils.add_openvino_libs_to_path()
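The steps above can be sketched end to end. This is a minimal, hedged example that applies the Windows DLL-path step and then checks whether the OpenVINO Execution Provider is registered; it is guarded so it runs harmlessly where the wheel is not installed.

```python
import importlib.util
import sys

# After `pip install onnxruntime-openvino` (and, on Windows with the pip
# openvino package, the add_openvino_libs_to_path() step), the provider
# should appear in ONNX Runtime's available-provider list.
openvino_available = False
if importlib.util.find_spec("onnxruntime") is not None:
    import onnxruntime as ort

    if sys.platform == "win32":
        # Put the OpenVINO runtime DLLs on the search path (Windows only).
        import onnxruntime.tools.add_openvino_win_libs as utils
        utils.add_openvino_libs_to_path()

    openvino_available = (
        "OpenVINOExecutionProvider" in ort.get_available_providers()
    )
```

If `openvino_available` is True, sessions created with `providers=["OpenVINOExecutionProvider"]` will run through the OpenVINO EP.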

C# Package:
Download the Microsoft.ML.OnnxRuntime.Managed NuGet package from the link below:
https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.Managed
and use it with the Intel.ML.OnnxRuntime.OpenVino NuGet package from the link below:
https://www.nuget.org/packages/Intel.ML.OnnxRuntime.OpenVino

ONNXRuntime API Usage:
Please refer to the link below for the Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options
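As a quick illustration of the configuration options documented at the link above, here is a hedged sketch of passing provider options when creating a session. The keys `device_type` and `precision` are documented OpenVINO EP options; the values shown and the model path are illustrative placeholders.

```python
# Provider options select the target device and inference precision
# (example values; see the configuration-options documentation for the
# full list of supported keys and values).
provider_options = {"device_type": "CPU", "precision": "FP32"}
providers = [("OpenVINOExecutionProvider", provider_options)]

# With onnxruntime-openvino installed, a session would be created as:
# import onnxruntime
# session = onnxruntime.InferenceSession("model.onnx", providers=providers)
```

Passing the provider as a `(name, options)` tuple applies the options only to that execution provider.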