Hi all,
I'm building an Edge/Chrome extension (a React + Vite app) that runs ONNX models using onnxruntime-web with the "cpu" execution provider (WASM backend). The extension works as expected on most websites, but model initialization consistently fails on LinkedIn with the following error:
Error: no available backend found.
[cpu] RuntimeError: Aborted(CompileError: WebAssembly.instantiate(): ...)
Context:
The .wasm files (ort-wasm-simd-threaded.wasm, etc.) are included in the extension's assets and configured via:
ort.env.wasm.wasmPaths = chrome.runtime.getURL("assets/");
These assets are correctly listed in the extension's web_accessible_resources.
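For reference, the relevant manifest.json entry looks roughly like this (the "assets/*.wasm" path and "<all_urls>" match pattern are illustrative; adjust them to wherever Vite emits the .wasm files and to the sites the extension actually targets):

```json
{
  "web_accessible_resources": [
    {
      "resources": ["assets/*.wasm"],
      "matches": ["<all_urls>"]
    }
  ]
}
```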
The same logic works fine on other websites such as Instagram, X, etc.
The Network tab confirms the WASM files are fetched successfully on LinkedIn as well.
Hypothesis:
This seems related to LinkedIn's Content Security Policy: if the page's script-src lacks 'wasm-unsafe-eval', WebAssembly.instantiate() is blocked, and content scripts appear to inherit that restriction from the host page.
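One way to confirm this hypothesis is to probe the page directly from the content script. If instantiating a minimal 8-byte module (WASM magic number + version, a complete and valid empty module) throws a CompileError on LinkedIn but succeeds elsewhere, the page CSP is the culprit rather than anything ONNX-specific. A sketch (the function name is mine):

```javascript
// Probe whether the current page's CSP allows WebAssembly compilation.
// The 8-byte buffer below is a complete, valid, empty WASM module
// (magic "\0asm" + version 1), enough to trigger the CSP check
// without fetching a real .wasm file.
async function wasmCompilationAllowed() {
  const emptyModule = new Uint8Array([
    0x00, 0x61, 0x73, 0x6d, // "\0asm"
    0x01, 0x00, 0x00, 0x00, // version 1
  ]);
  try {
    await WebAssembly.instantiate(emptyModule);
    return true;
  } catch (e) {
    // On pages whose CSP lacks 'wasm-unsafe-eval', this is a CompileError.
    return false;
  }
}
```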
Questions:
Is onnxruntime-web known to have issues running in content scripts on pages with strict CSPs?
Would moving model inference to an iframe or offscreen document (within the extension context) be a recommended workaround?
Are there other suggested patterns to safely run ONNX inference in such restricted environments?
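On the offscreen-document question: since an offscreen document is an extension page, WebAssembly there runs under the extension's own CSP (which can be granted 'wasm-unsafe-eval' via content_security_policy.extension_pages in the manifest) rather than LinkedIn's. A rough sketch of the pattern, assuming a hypothetical offscreen.html and an invented { type, data } message shape — not an ort or Chrome-prescribed protocol:

```javascript
// Sketch (MV3 service worker): create an offscreen document so inference
// runs under the extension's CSP instead of the host page's.
// "offscreen.html" is an assumed extension page that loads ort and the model.
async function ensureOffscreenDocument() {
  const exists = await chrome.offscreen.hasDocument();
  if (!exists) {
    await chrome.offscreen.createDocument({
      url: "offscreen.html",
      reasons: ["WORKERS"], // ort-web may spawn worker threads for WASM
      justification: "Run ONNX Runtime Web inference outside the page CSP",
    });
  }
}

// Content-script side: forward input data to the offscreen document and
// await the result. The offscreen page would listen via
// chrome.runtime.onMessage, run the session, and reply.
function runInference(inputData) {
  return chrome.runtime.sendMessage({ type: "run-onnx", data: inputData });
}
```

Note that chrome.offscreen is only available to the service worker, not to content scripts, so the content script would first message the worker to call ensureOffscreenDocument().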
Any guidance or examples would be appreciated. Thanks!
Code snippet used to create the ONNX inference session:
this.model = await ort.InferenceSession.create(modelPath, {
  executionProviders: ["cpu"], // resolves to the WASM backend in onnxruntime-web
  graphOptimizationLevel: "all",
  logSeverityLevel: 3, // errors only
});