Conversation
```python
    collection_name
)
# todo: retrieve messages from chat history as BaseMessage
messages = []
```
I think we have three options for fetching the user's history:
- The request body contains the history. On the server side, before sending the request to the genai service, we can attach the history to the request. The chat flow is then managed by the server, and we can also attach the user's food preferences.
- The genai service can query MongoDB.
- The genai service can request the history from the server.

Personally, I prefer the first one; the flow is easier to manage.
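A minimal sketch of the first option. The function and field names (`build_genai_request`, `history`, `preferences`) are illustrative, not the actual server code:

```python
# Sketch of option 1: the server attaches the stored chat history (and the
# user's food preferences) to the request body before calling the genai
# service. All field names here are illustrative assumptions.

def build_genai_request(query: str, history: list[dict], preferences: list[str]) -> dict:
    """Assemble the request body the server would forward to the genai service."""
    return {
        "query": query,
        # Full conversation so far, as {"role", "content"} dicts
        "history": history,
        # Optional user food preferences, attached server-side
        "preferences": preferences,
    }

body = build_genai_request(
    "What can I cook tonight?",
    [{"role": "user", "content": "I have eggs and tomatoes."}],
    ["vegetarian"],
)
```

With this shape, the genai service stays stateless: it never resolves a `conversationId` itself.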
I also thought about the first two options, but the first one makes better sense to me as well. That way we would not handle the conversationId logic in the genai service either. Let's use the first one.
genai/service/openwebui_service.py
Outdated
```python
from genai.config import Config

BASE_URL = "https://gpu.aet.cit.tum.de/"
```
Let's make it a config value so that we can easily change the LLM provider.
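One way this could look, sketched with an assumed attribute name (`OPENWEBUI_BASE_URL`) on the existing `Config` class and an environment-variable override:

```python
import os

# Sketch: read the LLM provider URL from config instead of hard-coding it.
# The attribute name OPENWEBUI_BASE_URL and the env-var override are
# illustrative assumptions, with the current URL kept as the fallback.
class Config:
    OPENWEBUI_BASE_URL: str = os.environ.get(
        "OPENWEBUI_BASE_URL", "https://gpu.aet.cit.tum.de/"
    )

BASE_URL = Config.OPENWEBUI_BASE_URL
```

Swapping providers then only requires setting the environment variable, not touching `openwebui_service.py`.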
genai/rag/llm/chat_model.py
Outdated
```python
    stop=None,
    **kwargs) -> ChatResult:
    prompt = "\n".join([
        msg.content for msg in messages if isinstance(msg, HumanMessage)
```
Does this mean we add only the user's messages as context? If so, shouldn't we include the whole chat?
Yes, correct. Including the LLM's messages sometimes has a downside: the LLM can hallucinate from its previous responses in the chat history and fail to generate a correct answer for an already-asked query (if the earlier response in the history was wrong).
But for our application I don't think it will be a huge issue, since we prioritize the context first and use the full RAG functionality. So I will add the AI messages in response generation 👍
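A sketch of the agreed change: flatten both human and AI turns into the prompt instead of filtering to `HumanMessage` only. Minimal stand-in classes are defined here so the snippet is self-contained; in the real code they come from LangChain's message types:

```python
# Stand-ins for LangChain's message classes, so this sketch runs standalone.
class BaseMessage:
    def __init__(self, content: str):
        self.content = content

class HumanMessage(BaseMessage):
    pass

class AIMessage(BaseMessage):
    pass

def build_prompt(messages: list[BaseMessage]) -> str:
    # Include both user and assistant turns, tagged by speaker,
    # rather than keeping only HumanMessage contents.
    lines = []
    for msg in messages:
        role = "User" if isinstance(msg, HumanMessage) else "Assistant"
        lines.append(f"{role}: {msg.content}")
    return "\n".join(lines)

prompt = build_prompt([
    HumanMessage("I have eggs and tomatoes."),
    AIMessage("You could make shakshuka."),
])
```

The role tags keep the turns distinguishable in the flattened prompt, which limits the self-hallucination risk mentioned above.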
```python
messages (List[Dict]): Full conversation history, each with 'role' and 'content'
    Example:
    [
        {"role": "user", "content": "I have eggs and tomatoes."},
```
On the server, role is an enum and is stored in all capital letters. It is better to check the role case-insensitively.
@esadakcam I changed it, thanks! I just checked the server code, and I think there is a typo in the server enums: it should be ASSISTANT instead of ASISTANT. Since I don't want to break the structure of the server code, could you change it in a follow-up?
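A sketch of the case-insensitive role check, which also tolerates the server's current `ASISTANT` spelling until the follow-up lands. The mapping and function name are illustrative:

```python
# Normalize roles case-insensitively. "ASISTANT" is the server-side enum
# typo noted above; we accept it here until it is fixed in a follow-up.
ROLE_MAP = {
    "USER": "user",
    "ASSISTANT": "assistant",
    "ASISTANT": "assistant",  # tolerated server typo
}

def normalize_role(role: str) -> str:
    try:
        return ROLE_MAP[role.upper()]
    except KeyError:
        raise ValueError(f"Unknown role: {role!r}")

role = normalize_role("Assistant")
```

Uppercasing before the lookup means `"user"`, `"User"`, and `"USER"` all resolve the same way.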
`/genai/generate` - Body must contain the `query` and `conversation_id` (or another kind of id, which is necessary to make a db call to fetch the user's chat history; this will be discussed further during the week).
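For illustration, a body matching that description might look like this (the id value is a placeholder, and its exact semantics are still to be decided):

```python
import json

# Illustrative request body for POST /genai/generate, per the note above.
request_body = {
    "query": "What can I cook with eggs and tomatoes?",
    # id used to fetch the user's chat history; semantics TBD
    "conversation_id": "abc123",
}
payload = json.dumps(request_body)
```

If the team adopts option 1 from the discussion above, the server would instead inline the fetched history here before forwarding the request.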