At the time of writing my original post, I was not aware of how to wrap the tool response in an AI message. I ran some tests, and it is working now: piping a small wrapper step onto the end of the chain converts the structured output back into an AIMessage, which RunnableWithMessageHistory can then store in the chat history.

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables.history import RunnableWithMessageHistory

def _get_chat_llm_with_history():
    # include_raw=False means the chain emits the parsed answer rather than
    # an AIMessage, so a wrapper step is piped in to restore one (see below).
    chat_llm = llm.with_structured_output(answer_schema, include_raw=False)
    prompt_template: ChatPromptTemplate = _get_prompt_template()
    runnable = prompt_template | chat_llm | wrap_tool_response_in_AImessage
    return RunnableWithMessageHistory(
        runnable=runnable,
        get_session_history=session_store.get_session_history,
        input_messages_key="input",
        history_messages_key="chat_history",
        output_messages_key=None,
    )

def wrap_tool_response_in_AImessage(response):
    r…
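
A minimal sketch of that wrapper, since its body is truncated above: the serialization below is an assumption, as is treating the structured output as a dict or a Pydantic v2 model.

import json

from langchain_core.messages import AIMessage

def wrap_tool_response_in_AImessage(response):
    # The structured-output step yields a parsed object rather than a
    # message, which RunnableWithMessageHistory cannot persist directly.
    # Serialize it and wrap it in an AIMessage before it hits the history.
    if hasattr(response, "model_dump"):  # Pydantic v2 model (assumption)
        response = response.model_dump()
    return AIMessage(content=json.dumps(response))

With the wrapper in place, the chain is invoked per session as usual (the session_id key assumes the default history_factory_config):

chain = _get_chat_llm_with_history()
result = chain.invoke(
    {"input": "What is the capital of France?"},
    config={"configurable": {"session_id": "demo-session"}},
)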

Answer selected by filgit