Is streaming results possible? (Not in a Jupyter notebook context) #118
Unanswered
andysalerno asked this question in Q&A
Replies: 1 comment
-
I am working on exposing a nice API for that. Will post in the discussion when it lands :) The current plan is to do something like:

```python
for p in program(stream=True):
    # do stuff with p, which is the partially executed program
```

and an equivalent async version. Any comments on the API are welcome. The upside of the above is that …
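A minimal sketch of what that loop could look like in practice. Since the streaming API described above has not landed yet, `make_program` below is a hypothetical stand-in that simulates a program yielding its partially executed state after each token:

```python
# Hypothetical stand-in for a compiled program: yields its partial
# output after each token when stream=True, mimicking the proposed API.
def make_program(tokens):
    def program(stream=False):
        text = ""
        for tok in tokens:
            text += tok
            if stream:
                yield text  # partial state after each token
        if not stream:
            yield text  # only the final result
    return program

program = make_program(["Hello", ", ", "world", "!"])

partials = []
for p in program(stream=True):
    # do stuff with p, the partially executed program state
    partials.append(p)

print(partials[-1])  # the final, fully executed output
```

The key design point is that each iteration hands back the whole partial state, not just the newest token, so the consumer can diff, re-render, or inspect intermediate variables as it likes.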
0 replies
-
Hi there,
I'm having a blast prototyping with this library. It's really excellent!
One thing I haven't been able to figure out: is it possible to enable async streaming of tokens from the model?
I've seen the tutorials and API that show `stream=True`, but this seems to only have an effect in Jupyter-notebook-hosted scenarios? That's how it seems, anyway. I'm wondering: if I wanted to stream tokens in realtime to my own client, is there a supported way to do this?
In my own toy projects, I'm always showing token generation in realtime, since it's a much better user experience. But after switching to Guidance, I'm missing out on that. Everything else is 100x better with Guidance though :)
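For reference, the pattern I use in my toy projects boils down to forwarding only the newly generated text to the client. This is a generic sketch (not Guidance API; `partial_outputs` is a hypothetical stand-in for whatever partial results a streaming run would yield), diffing each partial result against the previous one:

```python
# Generic delta-streaming pattern: given successive partial outputs,
# forward only the newly appended text to the client.
def stream_deltas(partial_outputs):
    seen = ""
    for text in partial_outputs:
        delta = text[len(seen):]  # text appended since last partial
        seen = text
        if delta:
            yield delta

# Simulated partial outputs from a streaming generation.
chunks = list(stream_deltas(["Hi", "Hi the", "Hi there", "Hi there!"]))
print("".join(chunks))
```

In a real client this generator would feed a websocket or server-sent-events response, writing each `delta` as it arrives.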