PoC to test libtorch support for onnxruntime-extensions #770
This PR prototypes an extension that adds to `onnxruntime-extensions` the ability to execute kernels implemented with PyTorch's `libtorch`. For this experiment, only the CPU-only libtorch from https://download.pytorch.org/libtorch/nightly/cpu/libtorch-shared-with-deps-latest.zip was tested, but it should also work with a GPU-enabled libtorch.
Note that the libtorch build flavor (CPU/CUDA) used to compile the extension must match the PyTorch build installed at runtime; otherwise, undefined-reference errors will occur.
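PyTorch wheels encode the build flavor in a local version suffix (e.g. `2.1.0+cpu` or `2.1.0+cu121`), which makes the mismatch detectable before the linker errors surface. The sketch below illustrates the kind of check one could add; the version strings and the `libtorch_flavor` value are illustrative assumptions, not values from this PR (at runtime you would read `torch.__version__` instead).

```python
def build_flavor(version: str) -> str:
    """Return the build flavor encoded in a PyTorch version string.

    PyTorch wheels carry a local version suffix, e.g. "2.1.0+cpu" or
    "2.1.0+cu121"; libtorch archives are likewise published per flavor.
    """
    return version.split("+", 1)[1] if "+" in version else "unknown"

# Illustrative values; in practice, runtime_version = torch.__version__.
libtorch_flavor = "cpu"        # flavor of the libtorch used to build
runtime_version = "2.1.0+cpu"  # version of the PyTorch wheel installed

if build_flavor(runtime_version) != libtorch_flavor:
    raise RuntimeError(
        f"libtorch ({libtorch_flavor}) does not match the installed "
        f"PyTorch ({runtime_version}); expect undefined-reference errors"
    )
print("build flavors match")
```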
As a proof of concept, a dummy `MyRelu` op was introduced, which calls into the `torch::relu` operator. This serves as a minimal example. There are implementations for both CUDA and CPU ReLU, but only the CPU version has been tested.

To replicate this PR in a clean environment, users must install the following as requirements:
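As an aside on the `MyRelu` op above: its real kernel lives in C++ and dispatches to libtorch, but its numerical behavior is just elementwise `max(x, 0)`. This pure-Python sketch only documents the expected semantics and is not the PR's implementation:

```python
def my_relu_reference(values):
    """Reference semantics of the MyRelu op: elementwise ReLU, max(x, 0).

    The kernel in this PR calls torch::relu from libtorch; this
    pure-Python version only documents the expected numerical behavior.
    """
    return [max(v, 0.0) for v in values]

print(my_relu_reference([-1.5, 0.0, 2.0]))  # [0.0, 0.0, 2.0]
```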
Next, users must `export OCOS_LIBTORCH_PATH=/path/to/libtorch` before calling `pip install .` from the onnxruntime-extensions repo.

As an alternative, the full CPU-only working environment can be pulled and tested from Docker Hub using the following command: