Groq - Mixtral #1630
michaelsolo221
started this conversation in
Ideas
Replies: 1 comment
Thanks for posting; we are currently looking into how best to offer more generic model support.
Describe the feature or potential improvement
Hi Team,
Raising this request to have Langfuse record the model name for Mixtral when called via Groq.
I am currently using Groq with mixtral-8x7b-32768.
When calling invoke(), I get this warning message:
WARNING:langfuse:Langfuse was not able to parse the LLM model. The LLM call will be recorded without model name. Please create an issue so we can fix your integration: https://github.com/langfuse/langfuse/issues/new/choose
```python
from langchain_groq import ChatGroq
from langchain_core.prompts import ChatPromptTemplate

# langfuse_handler is the Langfuse CallbackHandler, initialized earlier
chat = ChatGroq(temperature=0, model_name="mixtral-8x7b-32768")
system = "You are a helpful assistant."
human = "{text}"
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])
chain = prompt | chat
chain.invoke({"text": "Explain the importance of low latency LLMs."}, config={"callbacks": [langfuse_handler]})
```
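The warning above is emitted when Langfuse cannot extract a model name from the LangChain invocation parameters. As a rough illustration of the failure mode, here is a minimal, hypothetical sketch of that kind of key lookup; the function name and key list are illustrative assumptions, not Langfuse's actual implementation:

```python
import logging
from typing import Optional

logger = logging.getLogger("langfuse")

def parse_model_name(invocation_params: dict) -> Optional[str]:
    """Try the keys commonly used by LangChain chat model wrappers.

    Hypothetical sketch: an integration that only checks "model" would
    miss a wrapper like ChatGroq that stores the name under "model_name".
    """
    for key in ("model", "model_name", "model_id"):
        name = invocation_params.get(key)
        if name:
            return name
    logger.warning(
        "Langfuse was not able to parse the LLM model. "
        "The LLM call will be recorded without model name."
    )
    return None

# ChatGroq(model_name="mixtral-8x7b-32768") carries its name roughly like this:
params = {"model_name": "mixtral-8x7b-32768", "temperature": 0}
print(parse_model_name(params))  # -> mixtral-8x7b-32768
```

If the integration's lookup does not include the key a given wrapper uses, the call is still traced, just without a model name attached.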
Additional information
I have searched for similar discussions re: Groq; none were found.