Replies: 1 comment
-
We are experiencing the same errors. Did you manage to get the library to run correctly?
0 replies
-
Hi everyone,
I love Transformers4Rec and have been trying to learn it through the online tutorial. I followed the
In the third notebook, 03-serving-session-based-model-torch-backend.ipynb, I tried to launch the Triton Inference Server, but it failed.
The error message contains:
I deployed Triton Inference Server through the NGC container, as described in https://github.com/triton-inference-server/server, and the Triton server returned the expected result from "Serve a Model in 3 Easy Steps". I am not sure if I missed something when following the Transformers4Rec tutorial; could you please give me a clue?
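For context, this is roughly how the Triton quickstart launches the server from the NGC container; the container tag and the model repository path are placeholders here and should match whatever the Transformers4Rec notebook exported:

```shell
# Launch Triton from the NGC container (tag is an example; use the one from the tutorial).
# Mount the model repository produced by the notebook into /models inside the container.
docker run --gpus=1 --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v ${PWD}/model_repository:/models \
  nvcr.io/nvidia/tritonserver:24.05-py3 \
  tritonserver --model-repository=/models

# In another terminal, check that the server reports ready:
# curl -v localhost:8000/v2/health/ready
```

If the health check fails or a model does not load, the server's startup log usually names the model and backend that failed, which narrows down the cause.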
By the way, I have also reported the details here:
Thank you in advance
Wei