Bug description
I have two functions specified in Chat Options. I can see from the logs that these functions are being invoked by the LLM, but when I try to retrieve the function details using the getToolCalls() method, they are not returned (even though the functions have actually been invoked by the LLM).
Code snippet -
Prompt prompt = new Prompt(List.of(systemMessage, userMessage),
        OpenAiChatOptions.builder().withTemperature(0.7f)
                .withModel("gpt-4o")
                .withFunction("findPapers")
                .withFunction("summarizePaper")
                .withParallelToolCalls(false)
                .build());

Flux<ChatResponse> chatResponseStream = chatModel.stream(prompt);
chatResponseStream
        .map(response -> response.getResult().getOutput().getToolCalls())
        .doOnNext(toolCalls -> {
            // Returns an empty list even when a function has actually been invoked
            logger.info("Tool calls: {}", toolCalls);
        })
        .onErrorContinue((e, o) -> logger.error("Error occurred while processing chat response", e))
        .subscribe();
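For illustration only, here is a minimal plain-Java sketch (not Spring AI API; the Chunk record and invokedToolCalls helper are hypothetical stand-ins) of what the pipeline above is trying to observe: in a streamed response, each chunk carries its own tool-call list, most of which are empty, so logging the per-chunk list mostly prints empty lists even when tool calls did occur somewhere in the stream.

```java
import java.util.List;
import java.util.stream.Stream;

public class ToolCallFilter {
    // Hypothetical stand-in for a streamed ChatResponse chunk: each chunk
    // carries a (possibly empty) list of tool-call names.
    record Chunk(List<String> toolCalls) {}

    // Flatten the stream and keep only the tool calls that actually appear,
    // mirroring what the reactive pipeline above attempts to log per chunk.
    static List<String> invokedToolCalls(Stream<Chunk> chunks) {
        return chunks.flatMap(c -> c.toolCalls().stream()).toList();
    }

    public static void main(String[] args) {
        List<Chunk> chunks = List.of(
                new Chunk(List.of()),              // chunk with no tool calls
                new Chunk(List.of("findPapers")),  // tool call surfaced mid-stream
                new Chunk(List.of("summarizePaper")),
                new Chunk(List.of()));             // final chunk, empty again
        System.out.println(invokedToolCalls(chunks.stream()));
        // prints [findPapers, summarizePaper]
    }
}
```

If the expected behavior held, an aggregation like this over the whole stream would yield the invoked functions rather than a sequence of empty lists.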
Environment
Spring AI version - 1.0.0-M2
Steps to reproduce
Please see the sample code above for the steps.
Expected behavior
Expecting the getToolCalls() method to return the list of functions invoked by the LLM.