Conversation

Hrant-Khachatrian

In case someone needs to generate many samples from one model on demand without loading the checkpoint every time, the easiest solution seems to be running an HTTP server.

This requires the Turbo networking framework.

This is obviously not a serious contribution, but it might be useful for some people, especially if the generated text is to be consumed by a non-Lua script. For example, I use this server to back a bot similar to DeepDrumpf, written in Python.
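The pattern here is just "pay the checkpoint-loading cost once at startup, then answer sample requests over HTTP". The actual PR does this in Lua with Turbo; the following is only an illustrative Python sketch of the same idea, where a dummy `sample()` function stands in for the char-rnn checkpoint and forward pass, and the route and port are assumptions, not the PR's API.

```python
# Sketch of the load-once, serve-many pattern described above.
# The real server is Lua/Torch + Turbo; here a dummy sample()
# stands in for loading a checkpoint and sampling from it.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the expensive startup step: in the real server this
# is where the torch checkpoint would be loaded, exactly once.
MODEL_VOCAB = list("abcdefghijklmnopqrstuvwxyz ")

def sample(length=32):
    """Dummy sampler standing in for the char-rnn forward pass."""
    return "".join(random.choice(MODEL_VOCAB) for _ in range(length))

class SampleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every request reuses the already-loaded "model";
        # nothing is reloaded per request.
        body = json.dumps({"sample": sample()}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), SampleHandler).serve_forever()
```

A non-Lua client (like the Python bot mentioned above) then only needs a plain HTTP GET, e.g. `urllib.request.urlopen("http://127.0.0.1:8080/")`, to fetch a fresh sample.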

@AlekzNet

AlekzNet commented Nov 7, 2016

Thanks much! Works perfectly, and it's exactly what I need!
