[HF][streaming][1/n] Text Summarization

TSIA

Adding streaming functionality to the text summarization model parser.

## Test Plan

Rebase onto and test it with 11ace0a. Follow the README from AIConfig Editor https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev, then run these commands:

```bash
aiconfig_path=/Users/rossdancraig/Projects/aiconfig/cookbooks/Gradio/huggingface.aiconfig.json
parsers_path=/Users/rossdancraig/Projects/aiconfig/cookbooks/Gradio/hf_model_parsers.py
alias aiconfig="python3 -m 'aiconfig.scripts.aiconfig_cli'"
aiconfig edit --aiconfig-path=$aiconfig_path --server-port=8080 --server-mode=debug_servers --parsers-module-path=$parsers_path
```

Then in AIConfig Editor run the prompt (it will be streaming format by default).

https://github.com/lastmile-ai/aiconfig/assets/151060367/e91a1d8b-a3e9-459c-9eb1-2d8e5ec58e73
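The streaming flow this PR adds can be sketched generically. A minimal sketch, assuming a chunk iterator and a caller-supplied callback (both names are hypothetical illustrations, not the actual parser API):

```python
from typing import Callable, Iterable

def stream_output(chunks: Iterable[str], callback: Callable[[str, str], None]) -> str:
    """Hypothetical sketch of a streaming loop: forward each new chunk to
    the caller's callback along with the text accumulated so far, then
    return the full output once generation finishes."""
    accumulated = ""
    for new_text in chunks:
        accumulated += new_text
        callback(new_text, accumulated)
    return accumulated

# Example: print chunks as they arrive, collect the final summary.
summary = stream_output(["The quick ", "brown fox."], lambda new, acc: print(new, end=""))
```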
```python
new_text = new_text.replace("</s>", "")
new_text = new_text.replace("<s>", "")
```
Do these just signify the start and end of the stream?
Not quite! I'm really not sure why they show up for a few models while TextGeneration doesn't need this.
I saw output like `</s><s>Streaming Text</s>`, but yeah, I just removed them so the output looks clean.
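For context, `<s>` and `</s>` look like tokenizer special tokens (BOS/EOS) leaking into the decoded text rather than stream delimiters. A minimal sketch of the stripping step, assuming the hypothetical helper name `clean_chunk`:

```python
# These sentinel strings resemble BOS/EOS special tokens that some
# Hugging Face tokenizers emit verbatim in decoded output.
SPECIAL_TOKENS = ("<s>", "</s>")

def clean_chunk(new_text: str) -> str:
    """Remove sentinel/special tokens from a streamed text chunk."""
    for token in SPECIAL_TOKENS:
        new_text = new_text.replace(token, "")
    return new_text

print(clean_chunk("</s><s>Streaming Text</s>"))  # -> Streaming Text
```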
```python
should_stream = (options.stream if options else False) and (
    not "stream" in completion_data or completion_data.get("stream") != False
)
```
This logic is quite complicated. Ideally I'd prefer to break it out into if/else branches or a helper function, but it's not ship-blocking if it works. I'm just struggling to reason through how it behaves.
For example, if no options are specified, stream will be false even if there is a "stream" key in completion_data. Is that the correct behavior?
> if no options is specified, stream will be false, even if there is stream in completion_data. Is that correct behavior?
That is correct! The reasoning is that stream isn't really a model setting; it's an inference setting for which we need to pass in a stream callback, so streaming doesn't really make sense without one (where would you stream out to?).
Sure, I made a task to split this into a helper function: #861
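As a hedged sketch of what that refactor might look like (the helper name and `options` shape are assumptions for illustration, not taken from the PR), the inline expression above could become:

```python
from types import SimpleNamespace
from typing import Any, Optional

def should_stream(options: Optional[Any], completion_data: dict) -> bool:
    """Decide whether to stream, mirroring the inline expression above."""
    if options is None or not options.stream:
        # No options means no stream callback was provided, so there is
        # nowhere to stream to.
        return False
    # Streaming can still be explicitly disabled in the model settings.
    if completion_data.get("stream") == False:  # noqa: E712 (mirrors the original `!= False` check)
        return False
    return True

opts = SimpleNamespace(stream=True)
print(should_stream(None, {"stream": True}))   # False: no options/callback
print(should_stream(opts, {}))                 # True
print(should_stream(opts, {"stream": False}))  # False: explicitly disabled
```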
[HF][5/n] Image2Text: Allow base64 inputs for images

Before, we didn't allow base64 inputs, only URIs (local, http, or https). This is useful because our text2Image model parser outputs in base64 format, so this will allow us to chain model prompts!

## Test Plan

Rebase and test on 0d7ae2b. Follow the README from AIConfig Editor https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev, then run these commands:

```bash
aiconfig_path=/Users/rossdancraig/Projects/aiconfig/cookbooks/Gradio/huggingface.aiconfig.json
parsers_path=/Users/rossdancraig/Projects/aiconfig/cookbooks/Gradio/hf_model_parsers.py
alias aiconfig="python3 -m 'aiconfig.scripts.aiconfig_cli'"
aiconfig edit --aiconfig-path=$aiconfig_path --server-port=8080 --server-mode=debug_servers --parsers-module-path=$parsers_path
```

Then in AIConfig Editor run the prompt (streaming is not supported, so I just took screenshots).

These are the images I tested (with the bear being in base64 format):

<img width="1281" alt="Screenshot 2024-01-10 at 04 57 44" src="https://github.com/lastmile-ai/aiconfig/assets/151060367/ea60cbc5-e6ab-4bf2-82e7-17f3182fdc5c">

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/856).

* __->__ #856
* #855
* #854
* #853
* #851
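A minimal sketch of how an image input could be classified as a base64 payload versus a URI. The function name and the data-URI heuristic are assumptions for illustration, not taken from the PR:

```python
import base64
import binascii

def is_base64_image(value: str) -> bool:
    """Heuristically classify an image input string.

    Treats http/https/file URLs and absolute paths as URIs; treats
    "data:image/..." data URIs and bare base64 payloads as base64.
    """
    if value.startswith(("http://", "https://", "file://", "/")):
        return False
    if value.startswith("data:image/"):
        return True
    try:
        # validate=True rejects any character outside the base64 alphabet.
        base64.b64decode(value, validate=True)
        return True
    except binascii.Error:
        return False
```

Note the heuristic is imperfect: short plain words like "bear" happen to be valid base64, so a real parser would likely also check a declared MIME type or attachment metadata.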