- I'd like to use djl-serving to serve model inference. Are there any more detailed docs about it, especially about how to define a Workflow?

Replies: 1 comment
-
- We are still working on the djl-serving documentation, and the Workflow feature is still under development; see this PR for more detail. For a regular model, djl-serving can serve it tensor-in, tensor-out out of the box; you only need a URL to the model file.
You can use the model URL directly when starting the server; a sketch follows.
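As a minimal sketch (the original example was lost in extraction), assuming djl-serving's `-m` option for registering a model at startup, a placeholder model name `mlp`, and a placeholder archive URL:

```sh
# Start djl-serving and register a model straight from a URL;
# "mlp" is a placeholder name, and the archive URL should point
# at your own model file.
djl-serving -m "mlp=https://resources.djl.ai/test-models/mlp.tar.gz"

# With the server running, POST a raw input payload to the model;
# djl-serving serves predictions at /predictions/<model-name>.
# input.data is a hypothetical file holding the serialized input tensor.
curl -X POST http://localhost:8080/predictions/mlp -T input.data
```

This is the tensor-in, tensor-out path mentioned above: the request body carries the input tensor and the response body carries the output tensor, with no custom pre- or post-processing code required.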