Invalid argument provided to Gemini: 400 Please ensure that function call turn comes immediately after a user turn or after a function response turn. #27137
Example Code

```python
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.schema.messages import SystemMessage
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain.agents import AgentExecutor
from langchain.agents import create_tool_calling_agent
from assistant.memory import memory


class Assistant:
    def __init__(self, llm, tools):
        self.agent = self._create_inference_chain(llm, tools)

    def answer(self, question: str, user_id: str) -> str:
        """
        Process a user's question and generate a response.

        :param question: The user's input question.
        :param user_id: The unique identifier for the user.
        :return: The AI-generated response to the question.
        """
        if not question:
            return

        # Retrieve up to 3 memories relevant to this question for this user.
        previous_memories = memory.search(question, user_id=user_id, limit=3)
        relevant_memories_text = "\n".join(
            mem["memory"] for mem in previous_memories["results"]
        )

        prompt = f"""
        User input: {question}

        Relevant memories:
        {relevant_memories_text}
        """

        response = self.agent.invoke(
            {"prompt": prompt},
            config={"configurable": {"session_id": "unused"}},
        )
        return response["output"]

    def _create_inference_chain(self, llm, tools):
        SYSTEM_PROMPT = """
        You are an AI personal assistant with context awareness, long-term memory,
        and the ability to take and interpret screenshots as well as capture images
        using the device camera. Your job is to assist the user, handle queries
        about the screen and the images captured with the camera, remember key
        details from conversations, and provide personalized support. Use past
        interactions to adapt responses and make future conversations more
        efficient. Respond naturally like a human, without explaining the reasoning
        behind your responses or why you chose them.
        """

        prompt_template = ChatPromptTemplate.from_messages(
            [
                SystemMessage(content=SYSTEM_PROMPT),
                # Only the most recent messages from the history are injected here.
                MessagesPlaceholder(variable_name="chat_history", n_messages=3),
                (
                    "human",
                    [
                        {"type": "text", "text": "{prompt}"},
                    ],
                ),
                MessagesPlaceholder(variable_name="agent_scratchpad"),
            ]
        )

        agent = create_tool_calling_agent(llm, tools, prompt=prompt_template)
        agent_executor = AgentExecutor(agent=agent, tools=tools)

        chat_message_history = ChatMessageHistory()
        return RunnableWithMessageHistory(
            agent_executor,
            lambda _: chat_message_history,
            input_messages_key="prompt",
            history_messages_key="chat_history",
        )
```

Description

Please help me with this issue; I get it randomly. As far as I understand, there is some sequencing problem, but I don't see how it could happen or how to fix it. It's been days without a solution. I'm including the code above for reference. Any guidance or solution is appreciated.

Edit: I've added the screenshot (debug).

System Info

Package Information
Replies: 3 comments 1 reply
-
Hello, @Divyanshu9822! I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer.

I found a similar open issue titled "Tool calling with Gemini through Vertex AI fails because of argument with Union type" which might be related to your problem. That issue is still open and was last updated on September 06, 2024 [1].

The error message you're encountering points to a sequencing issue with the function call turn: the function call must come immediately after a user turn or after a function response turn. This suggests that the order of messages in your chat history is not being maintained correctly, which leads to the error.

By ensuring that the message sequence is correctly maintained and that the history configuration is properly set up, you should be able to resolve the issue. If the problem persists, reviewing the related open issue might provide additional insights or updates on a potential fix.
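One way to debug this locally is to inspect the chat history before it is sent to the model. Below is a minimal sketch (not part of the original thread) of a check for the ordering Gemini expects, assuming recent `langchain_core` message classes; the helper name `check_gemini_ordering` and the sample history are made up for illustration:

```python
from langchain_core.messages import AIMessage, BaseMessage, HumanMessage, ToolMessage


def check_gemini_ordering(messages: list[BaseMessage]) -> list[str]:
    """Return a list of ordering problems instead of raising, for easy debugging."""
    problems = []
    if messages and not isinstance(messages[0], HumanMessage):
        problems.append(
            f"history starts with {type(messages[0]).__name__}, not HumanMessage"
        )
    for i, msg in enumerate(messages):
        # An AIMessage that carries tool calls must follow a human turn
        # or a tool (function response) turn.
        if isinstance(msg, AIMessage) and getattr(msg, "tool_calls", None):
            prev = messages[i - 1] if i > 0 else None
            if not isinstance(prev, (HumanMessage, ToolMessage)):
                problems.append(
                    f"tool-calling AIMessage at index {i} follows "
                    f"{type(prev).__name__ if prev else 'nothing'}"
                )
    return problems


# Example: this history has lost its leading HumanMessage, so the tool call
# no longer follows a user turn -- the shape that can trigger the 400 error.
bad_history = [
    AIMessage(content="", tool_calls=[{"name": "screenshot", "args": {}, "id": "1"}]),
    ToolMessage(content="<image>", tool_call_id="1"),
    AIMessage(content="Here is what I see on your screen."),
]
print(check_gemini_ordering(bad_history))
```

Running such a check on the exact list your `RunnableWithMessageHistory` injects as `chat_history` should show whether the sequence is being broken before the request reaches Gemini.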
-
@dosu I have added
-
I've resolved this myself. After tracking it properly, storing every intermediate state, and comparing them, I figured out that the `n_messages=5` param causes earlier messages to be cleared, so any type of message (`HumanMessage`, `AIMessage`, or `ToolMessage`) can end up first in the history. Gemini, however, needs the conversation to start with a `HumanMessage`, and for tool calling the function call messages need to be present as well. That combination is what caused the error; everything is working fine now without that param.
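Based on that resolution, a hedged sketch of the fix: drop `n_messages` from the `MessagesPlaceholder` so the injected history is never sliced mid-exchange. If the history still needs to be bounded, recent `langchain_core` versions provide `trim_messages`, which can be told to start on a human turn so an AI or tool message never ends up first; the exact parameter values below are assumptions to adapt to your setup:

```python
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.messages import trim_messages

# 1) The simple fix: no n_messages, so the leading HumanMessage and the
#    tool-call / tool-response pairs are never trimmed away.
prompt_template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are an AI personal assistant..."),
        MessagesPlaceholder(variable_name="chat_history"),
        ("human", "{prompt}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

# 2) If the history must stay bounded, trim it yourself before it reaches the
#    prompt instead of relying on n_messages.
def bounded_history(messages):
    return trim_messages(
        messages,
        max_tokens=6,        # with token_counter=len this means "last 6 messages"
        token_counter=len,   # count each message as 1 instead of counting tokens
        strategy="last",     # keep the most recent messages
        start_on="human",    # never let an AI or tool message come first
        include_system=True, # keep the system message if one is present
    )
```

The key property of either approach is that whatever window of history Gemini sees still begins with a human turn and keeps each function call adjacent to the turn that triggered it.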