Conversation


@corona10 corona10 commented Jan 3, 2025

Please let me know if I misunderstood the original intention.

  • Parallel warm-up options were exposed a long time ago, but AFAIK there is no way to set them on the model server side.
  • So it would be great if we could set this. Even when only one thread is used, it would be beneficial to run warmup on a separate, dedicated thread pool ("Warmup_ThreadPool") when the user specifies the number of threads explicitly.
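To make the suggestion concrete, here is a minimal, hypothetical sketch (not the actual TensorFlow Serving implementation) of what a server-side option could look like: a `num_warmup_threads` parameter (a name assumed here for illustration) that, when set, runs warmup requests on a dedicated pool, and otherwise preserves today's inline, single-threaded behavior.

```cpp
#include <atomic>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical sketch: if the user explicitly sets num_warmup_threads,
// run warmup requests on a dedicated "Warmup_ThreadPool"; otherwise run
// them inline on the caller's thread (the current behavior).
void RunWarmup(const std::vector<std::function<void()>>& requests,
               int num_warmup_threads) {
  if (num_warmup_threads <= 0) {
    // No explicit setting: keep warmup inline, no extra threads.
    for (const auto& req : requests) req();
    return;
  }
  // Dedicated pool: worker i handles requests i, i+N, i+2N, ...
  std::vector<std::thread> pool;
  for (int i = 0; i < num_warmup_threads; ++i) {
    pool.emplace_back([&, i] {
      for (size_t j = i; j < requests.size(); j += num_warmup_threads)
        requests[j]();
    });
  }
  // Warmup must finish before the model starts serving traffic.
  for (auto& t : pool) t.join();
}
```

The point of the dedicated pool is isolation: warmup traffic never competes with the server's regular request threads, even when `num_warmup_threads` is 1.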

Please let me know if I missed something or misunderstood anything; I am just trying to understand TensorFlow Serving's internals.
