How do I include FunctionMessages in either an agent's prompt, or memory #9224
Replies: 1 comment
🤖 Hello,

To include function outputs in an `OpenAIFunctionAgent`'s prompt messages or memory, you can modify the `_construct_scratchpad` method. In the current implementation, the method appends the action log and the observation to the scratchpad. Here is an example of how you can modify it to also append the function outputs:

```python
def _construct_scratchpad(
    self, intermediate_steps: List[Tuple[AgentAction, str]], function_outputs: List[Any]
) -> List[BaseMessage]:
    """Construct the scratchpad that lets the agent continue its thought process."""
    thoughts: List[BaseMessage] = []
    for (action, observation), function_output in zip(intermediate_steps, function_outputs):
        thoughts.append(AIMessage(content=action.log))
        human_message = HumanMessage(
            content=self.template_tool_response.format(observation=observation)
        )
        thoughts.append(human_message)
        thoughts.append(AIMessage(content=str(function_output)))  # append function output
    return thoughts
```

In this modified version of the method, each function output is appended to the scratchpad as an additional message after the tool observation. Please note that this is a simple example and you might need to adjust it according to your specific needs. For example, you might need to format the function outputs in a specific way before appending them to the scratchpad. You can find the original implementation of the `_construct_scratchpad` method in the LangChain source code.

I hope this helps! Let me know if you have any other questions.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
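The interleaving logic above can be exercised in isolation with a minimal, self-contained sketch. The message classes and template string below are stand-ins for illustration, not LangChain's real ones:

```python
from dataclasses import dataclass
from typing import Any, List, Tuple

# Stand-in message classes, just to demonstrate the scratchpad shape.
@dataclass
class AIMessage:
    content: str

@dataclass
class HumanMessage:
    content: str

TEMPLATE_TOOL_RESPONSE = "Tool response: {observation}"  # illustrative template

def construct_scratchpad(
    intermediate_steps: List[Tuple[str, str]],
    function_outputs: List[Any],
) -> List[Any]:
    """Interleave action log, observation, and raw function output per step."""
    thoughts: List[Any] = []
    for (action_log, observation), function_output in zip(
        intermediate_steps, function_outputs
    ):
        thoughts.append(AIMessage(content=action_log))
        thoughts.append(
            HumanMessage(content=TEMPLATE_TOOL_RESPONSE.format(observation=observation))
        )
        thoughts.append(AIMessage(content=str(function_output)))  # the extra message
    return thoughts

messages = construct_scratchpad([("calling get_secret", "ok")], ["the secret is stored"])
for m in messages:
    print(type(m).__name__, "->", m.content)
```

Each intermediate step thus contributes three messages instead of two, so the model sees the raw function output alongside the formatted observation.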
I want to include function outputs in an `OpenAIFunctionAgent`'s prompt messages or memory, so that it can recall the results of a tool without calling it again. Here is my current attempt:

However, `ConversationBufferMemory` does not seem to save tool outputs, as the model cannot remember the secret word in that code. Additionally, the output from `memory.buffer_as_str` is:
My second thought was to just include the function messages in the `extra_prompt_messages` argument. Because I am actually reading messages from a database each time anyway, memory objects are not really needed. However, there is no equivalent of `AIMessagePromptTemplate` for function messages. Any ideas how this can be achieved so that the function outputs are correctly passed to OpenAI's functions API?