
How to fall back ops from other providers to the default CPUExecutionProvider #25695

@xiaohoua

Description


Describe the issue

I am trying to run Kokoro TTS on Windows with an Intel CPU/GPU. It runs successfully with CPUExecutionProvider, but I want to run it on the Intel GPU, so I first tried onnxruntime-openvino and got this error:

Error: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: C:\Users\Administrator\Documents\windows_wheels\onnxruntime\onnxruntime\core\providers\openvino\ov_interface.cc:87 class onnxruntime::openvino_ep::OVExeNetwork __cdecl onnxruntime::openvino_ep::OVCore::CompileModel(class std::shared_ptr<class ov::Model const > &,class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > &,class std::map<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,class ov::Any,struct std::less<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > >,class std::allocator<struct std::pair<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > const ,class ov::Any> > > &,const class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > &) [OpenVINO-EP]  Exception while Loading Network for graph: OpenVINOExecutionProvider_OpenVINO-EP-subgraph_1_0Exception from src\inference\src\cpp\core.cpp:112:
Exception from src\inference\src\dev\plugin.cpp:53:
Check 'inputRank == 2 || inputRank == 4 || inputRank == 5' failed at src\plugins\intel_gpu\src\plugin\ops\interpolate.cpp:38:
Mode 'linear_onnx' supports only 2D or 4D, 5D tensors

When using OpenVINO on CPU:

Error: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: C:\Users\Administrator\Documents\windows_wheels\onnxruntime\onnxruntime\core\providers\openvino\ov_interface.cc:87 class onnxruntime::openvino_ep::OVExeNetwork __cdecl onnxruntime::openvino_ep::OVCore::CompileModel(class std::shared_ptr<class ov::Model const > &,class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > &,class std::map<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,class ov::Any,struct std::less<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > >,class std::allocator<struct std::pair<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > const ,class ov::Any> > > &,const class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > &) [OpenVINO-EP]  Exception while Loading Network for graph: OpenVINOExecutionProvider_OpenVINO-EP-subgraph_2_1Exception from src\inference\src\cpp\core.cpp:112:
Exception from src\inference\src\dev\plugin.cpp:53:
Exception from src\plugins\intel_cpu\src\node.cpp:90:
Unexpected: CPU plug-in doesn't support Parameter operation with dynamic rank. Operation name: /decoder/decoder/generator/STFT_output_0

Then I tried DmlExecutionProvider and got:

Non-zero status code returned while running ConvTranspose node. Name:'/encoder/F0.1/pool/ConvTranspose' Status Message: D:\a\_work\1\s\onnxruntime\core\providers\dml\DmlExecutionProvider\src\MLOperatorAuthorImpl.cpp(2816)\onnxruntime_pybind11_state.pyd!00007FFAC893C70A: (caller: 00007FFAC8908545) Exception(2) tid(8148) 80070057

It seems like all of these errors come from specific ops. So, as the issue title says, I want to know how to avoid running these failing ops on the GPU providers (since CPUExecutionProvider works fine, running just these failing ops with CPUExecutionProvider might be one approach).
Can you give me some advice?
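As a debugging aid (a hedged sketch, not something from this issue): ONNX Runtime's verbose logging prints which execution provider each graph node is placed on, which can help pinpoint the failing op before trying workarounds. `MODEL_PATH` reuses the path from this issue, and the import is guarded so the snippet runs even where onnxruntime is not installed.

```python
import importlib.util

MODEL_PATH = r"D:\TTS\kokoro-tts\kokoro-v1.0.onnx"  # path from this issue

if importlib.util.find_spec("onnxruntime"):
    import onnxruntime as rt

    so = rt.SessionOptions()
    so.log_severity_level = 0  # 0 = VERBOSE; logs include node placement
    # Creating the session with these options prints, per node, which
    # execution provider it was assigned to:
    # sess = rt.InferenceSession(MODEL_PATH, sess_options=so,
    #                            providers=["OpenVINOExecutionProvider",
    #                                       "CPUExecutionProvider"])
```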

To reproduce

pip install onnxruntime-openvino
pip install onnxruntime-directml
self.sess = rt.InferenceSession(r"D:\TTS\kokoro-tts\kokoro-v1.0.onnx", providers=providers, provider_options=[{'device_type': "GPU"}])
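One possible shape of the workaround the question asks about (a minimal sketch, assuming the tuple form of the onnxruntime Python API's `providers` argument): list CPUExecutionProvider after the preferred provider, so that ONNX Runtime can fall back to it for nodes the first provider does not take. Whether the OpenVINO EP still fails at subgraph compile time is exactly what this issue is asking about.

```python
# Sketch: provider priority list. ONNX Runtime assigns each graph node to
# the first listed provider that claims it, falling down the list otherwise;
# putting CPUExecutionProvider last makes it the catch-all for leftover ops.
providers = [
    ("OpenVINOExecutionProvider", {"device_type": "GPU"}),  # preferred
    ("CPUExecutionProvider", {}),                           # fallback
]
# import onnxruntime as rt
# sess = rt.InferenceSession(r"D:\TTS\kokoro-tts\kokoro-v1.0.onnx",
#                            providers=providers)
```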

onnxruntime 1.22.1
onnxruntime-gpu 1.22.0
onnxruntime-openvino 1.22.0
openvino 2025.0.0 17942
openvino-telemetry 2025.2.0
onnxruntime-directml 1.22.0

Urgency

No response

Platform

Windows

OS Version

windows11

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.22.1

ONNX Runtime API

Python

Architecture

X64

Execution Provider

OpenVINO

Execution Provider Library Version

No response

    Labels

    ep:DML (issues related to the DirectML execution provider), ep:OpenVINO (issues related to the OpenVINO execution provider), stale (issues that have not been addressed in a while; categorized by a bot)
