This is an attempt at a Rust wrapper for Microsoft's ONNX Runtime.

This project consists of two crates:

- [`onnxruntime-sys`](onnxruntime-sys): Low-level binding to the C API;
- [`onnxruntime`](onnxruntime): High-level and safe API.

[Changelog](CHANGELOG.md)
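To pull the high-level crate into a project, a dependency declaration might look like the following sketch (the version number is a placeholder, not taken from this README; check crates.io for the current release):

```toml
[dependencies]
# Placeholder version; the `cuda` feature is the one described later in this README.
onnxruntime = { version = "0.0.0", features = ["cuda"] }
```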
which provides the following targets:

CPU:

- Linux x86_64
- macOS x86_64
- macOS aarch64 (no pre-built binaries, no CI testing, see [#74](https://github.com/nbigaouette/onnxruntime-rs/pull/74))
- Windows i686
- Windows x86_64

GPU:

- Linux x86_64
- Windows x86_64
---

**WARNING**:

- This is an experiment and work in progress; it is _not_ complete/working/safe. Help welcome!
- Basic inference works; see [`onnxruntime/examples/sample.rs`](onnxruntime/examples/sample.rs) or [`onnxruntime/tests/integration_tests.rs`](onnxruntime/tests/integration_tests.rs)
- ONNX Runtime has many options to control the inference process, but those options are not yet exposed.
- This was developed and tested on macOS Catalina. Other platforms should work but have not been tested.

---
To select which strategy to use, set the `ORT_STRATEGY` environment variable to:

3. `compile`: To compile the library

The `download` strategy supports downloading a version of ONNX Runtime that supports CUDA. To use this, enable the `cuda` feature in `Cargo.toml`.

Until the build script allows compilation of the runtime, see the [compilation notes](ONNX_Compilation_Notes.md) for some details on the process.
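As a quick sketch, selecting a strategy is a matter of exporting the variable before building. The strategy names below come from this README; the `cargo build` step is commented out here since it depends on the surrounding project:

```shell
# Choose how the build script obtains the ONNX Runtime library.
# `download` and `compile` are the strategy names mentioned in this README.
export ORT_STRATEGY=download

# The build script would read ORT_STRATEGY at build time:
# cargo build

echo "ORT_STRATEGY=$ORT_STRATEGY"
```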
### Note on using CUDA

To use CUDA you will need to enable the `cuda` feature, but also configure your session with the `use_cuda` method, as such:

```
let mut session = environment
    // ... (remainder of the builder chain elided in this excerpt)
```
```
dyld: Library not loaded: @rpath/libonnxruntime.1.7.1.dylib
```

To fix, one can either:

- Set the `LD_LIBRARY_PATH` environment variable to point to the path where the library can be found.
- Adapt the `.cargo/config` file to contain a linker flag to provide the **full** path:
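The snippet the last bullet refers to is cut off in this excerpt. As a hypothetical sketch of what such a `.cargo/config` entry could look like (the target triple and the library path below are placeholders, not values from this README):

```toml
# Hypothetical .cargo/config fragment; adjust the target triple and the
# full path to wherever libonnxruntime is installed on your machine.
[target.x86_64-apple-darwin]
rustflags = ["-C", "link-args=-Wl,-rpath,/full/path/to/onnxruntime/lib"]
```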