Support for conversations with message history #234
Conversation
* an extra LLM call adds to latency and cost, but including the entire chat history, or even part of it, can potentially create a very large embedding context (see the sketch after this list)
* from `list` to `anthropic.MessageParam`
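To make the trade-off concrete, here is a rough sketch of the two options (helper names, prompt wording, and the `embed_query`/`invoke` usage are illustrative, not taken from this PR):

```python
def summarize_then_embed(llm, embedder, message_history, question):
    # Option A: one extra LLM call condenses the history into a standalone
    # question, keeping the embedding input small (but adding latency/cost).
    conversation = "\n".join(
        f"{m['role']}: {m['content']}" for m in message_history
    )
    summary = llm.invoke(
        "Rewrite the last question as a standalone question, given this "
        f"conversation:\n{conversation}\nuser: {question}"
    )
    return embedder.embed_query(summary.content)


def embed_full_history(embedder, message_history, question):
    # Option B: no extra call, but the embedding context grows with every
    # turn of the conversation.
    conversation = "\n".join(
        f"{m['role']}: {m['content']}" for m in message_history
    )
    return embedder.embed_query(f"{conversation}\nuser: {question}")
```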
* an idea of how to override the system instructions for some invocations
* for the type declaration of the `message_history` parameter
* bring back the `list[dict[str, str]]` type declaration for the `message_history` parameter (see the illustrative signature below)
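For reference, that declaration might sit on the interface roughly like this (a simplified, illustrative signature, not copied from the PR):

```python
from typing import Optional

class LLMInterface:
    def invoke(
        self,
        input: str,
        message_history: Optional[list[dict[str, str]]] = None,
    ):
        """Illustrative only: each dict in message_history holds a 'role'
        ("user" or "assistant") and a 'content' string."""
        ...
```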
```python
summarization_prompt = ChatSummaryTemplate().format(
    message_history=message_history
)
summary = self.llm.invoke(summarization_prompt)
```
I'm wondering if we should allow the user to use a different LLM for summarization. I'm thinking users might want to use a "small" LLM for this simple task, and use a "better" one for the Q&A part. But we can leave it for a later improvement.
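For instance, something along these lines could work (a sketch only; the `summarization_llm` keyword is hypothetical and not part of this PR):

```python
from neo4j_graphrag.generation import GraphRAG
from neo4j_graphrag.llm import OpenAILLM

# A small, cheap model for condensing the chat history...
summarizer_llm = OpenAILLM(model_name="gpt-4o-mini")
# ...and a stronger model for the actual Q&A.
qa_llm = OpenAILLM(model_name="gpt-4o")

rag = GraphRAG(
    retriever=retriever,  # assumes an existing retriever instance
    llm=qa_llm,
    # Hypothetical keyword, not implemented in this PR:
    # summarization_llm=summarizer_llm,
)
```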
* at the same time as making mypy shut up
* to help with the type declaration of the message history
@CodiumAI-Agent /update_changelog

Changelog updates: 🔄 2024-12-19
* ... for query embedding and summarization to the GraphRAG class
stellasia left a comment:
Nice work! 🥳🥳
Thanks for dealing with all the issues, especially the vendor specificity 🙏
Changelog updates: 🔄 2024-12-20
Description
Adding chat functionality with message history.
* Added a `chat_history` parameter to the `invoke` method of the `LLMInterface`. `chat_history` is a dict with `role` and `content` keys, where `role` can be either "user" or "assistant".
* Added a `system_instruction` parameter to the LLM class instantiation (meaning that you have to create a separate LLM instance for each use case; the motivation being that on `vertexai`, `system_instruction` is set on the `GenerativeModel` object, so it is separate from the question prompt).
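Putting the pieces together, usage might look roughly like this (a sketch based on the parameters discussed above; `retriever` is assumed to exist, and the exact `search` signature and response attributes may differ from the final API):

```python
from neo4j_graphrag.generation import GraphRAG
from neo4j_graphrag.llm import OpenAILLM

# One LLM instance per use case, since system_instruction is set at
# instantiation time (see the note above).
llm = OpenAILLM(
    model_name="gpt-4o",
    system_instruction="You are a movie expert.",
)

rag = GraphRAG(retriever=retriever, llm=llm)  # assumes an existing retriever

message_history = [
    {"role": "user", "content": "Who directed The Matrix?"},
    {"role": "assistant", "content": "The Matrix was directed by the Wachowskis."},
]

response = rag.search(
    query_text="What else did they direct?",
    message_history=message_history,
)
print(response.answer)
```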
Type of Change

Complexity
Complexity: High
How Has This Been Tested?
Checklist
The following requirements should have been met (depending on the changes in the branch):