Cohere LLM astream_complete #15843
Unanswered
anirbanbasu asked this question in Q&A
Replies: 1 comment · 6 replies
The error occurs because `astream_complete` on the Cohere LLM returns an async generator directly, and an async generator cannot be awaited. Here is the correct way to handle the asynchronous generator:

```python
async for response in llm.astream_complete(prompt):
    # Process each response chunk
    print(response)
```

This approach works for the Cohere LLM and should be used instead of awaiting the call.
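The failure mode is plain Python semantics, independent of any LLM library: awaiting an async generator raises exactly this `TypeError`. A minimal, dependency-free sketch (no `llama_index` needed; `stream_tokens` is a hypothetical stand-in for a provider's streaming method):

```python
import asyncio
import inspect


async def stream_tokens():
    # Stand-in async generator mimicking Cohere's astream_complete.
    for token in ["Hello", ", ", "world"]:
        yield token


async def main() -> list:
    gen = stream_tokens()
    assert inspect.isasyncgen(gen)

    # Awaiting an async generator raises TypeError in plain Python.
    try:
        await gen  # type: ignore[misc]
    except TypeError as exc:
        assert "await" in str(exc)

    # The correct pattern: iterate with `async for` instead of awaiting.
    chunks = []
    async for chunk in stream_tokens():
        chunks.append(chunk)
    return chunks


print(asyncio.run(main()))  # ['Hello', ', ', 'world']
```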
Hi,

Is there any reason why calling

```python
generator = await llm.astream_complete(prompt)
```

on an instance of the Cohere LLM (`from llama_index.llms.cohere import Cohere`) throws the following error, while there is no problem with Groq, OpenAI, Anthropic, and Ollama?

```
TypeError: object async_generator can't be used in 'await' expression
```

I am baffled. Is this intended?

(Note that the API key has been set correctly using the `api_key` parameter when initialising an object of the `Cohere` class.)