Fix a FunctionCallback inside a container that pollutes the global model's ChatOptions #1096
> The Spring Boot autoconfiguration class adds the container's FunctionCallback directly to the model's ChatOptions, which results in the FunctionCallback being included in the request each time it is called.
> The modification registers the container's FunctionCallback directly to the model's functionCallbackRegister.
IMO, we need to add a way to register the FunctionCallbacks inside the AbstractToolCallSupport constructor. What do you think?
AbstractToolCallSupport:

```java
public void registerFunctionCallback(List<FunctionCallback> functionCallbacks) {
    Map<String, FunctionCallback> toolFunctionCallbackMap = functionCallbacks.stream()
        .collect(Collectors.toMap(FunctionCallback::getName, Function.identity(), (a, b) -> b));
    this.functionCallbackRegister.putAll(toolFunctionCallbackMap);
}
```

AutoConfiguration:

```java
//....
if (!CollectionUtils.isEmpty(toolFunctionCallbacks)) {
    chatModel.registerFunctionCallback(toolFunctionCallbacks);
}
```
And I think it's better to remove the code that adds the container's FunctionCallbacks to the default ChatOptions.
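A side note on the proposed registerFunctionCallback sketch above: the `(a, b) -> b` merge function means that if two beans expose the same FunctionCallback name, the later one silently wins. A tiny standalone sketch of that toMap behaviour, with plain strings standing in for FunctionCallbacks (class and values are illustrative only):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ToMapMergeDemo {

    public static void main(String[] args) {
        // Two values that map to the same key, mimicking two FunctionCallbacks
        // that share a name. The merge function (a, b) -> b keeps the later one.
        Map<String, String> byName = List.of("weather#bean1", "weather#bean2").stream()
            .collect(Collectors.toMap(v -> v.split("#")[0], Function.identity(), (a, b) -> b));

        System.out.println(byName); // prints {weather=weather#bean2}
    }
}
```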
@tzolov Hello, in order to verify my suspicions, I wrote a demo program to try to reproduce the problem, and found that it only occurs when using the ChatClient (not when calling the ChatModel directly).

ChatModel:

```java
@EnabledIfEnvironmentVariable(named = "OPENAI_API_KEY", matches = ".*")
public class FunctionCallbackPollutesTest {
private final Logger logger = LoggerFactory.getLogger(FunctionCallbackPollutesTest.class);
private final ApplicationContextRunner contextRunner = new ApplicationContextRunner()
.withPropertyValues("spring.ai.openai.apiKey=" + System.getenv("OPENAI_API_KEY"))
.withPropertyValues("spring.ai.openai.baseUrl=" + System.getenv("OPENAI_BASE_URL"))
.withBean("MockWeatherService", FunctionCallback.class,()->
FunctionCallbackWrapper.builder(new MockWeatherService())
.withName("MockWeatherService")
.withDescription("Get the weather in location")
.withResponseConverter((response) -> "" + response.temp() + response.unit())
.build())
.withBean(MockWeatherService.class)
.withConfiguration(AutoConfigurations.of(OpenAiAutoConfiguration.class));
@Test
void functionCallTest() {
contextRunner
.withPropertyValues("spring.ai.openai.chat.options.model=gpt-4o",
"spring.ai.openai.chat.options.temperature=0.1")
.run(context -> {
OpenAiChatModel chatModel = context.getBean(OpenAiChatModel.class);
UserMessage userMessage = new UserMessage(
"What's the weather like in San Francisco, Tokyo, and Paris?");
var promptOptions = OpenAiChatOptions.builder()
.build();
ChatResponse response = chatModel.call(new Prompt(List.of(userMessage), promptOptions));
logger.info("Response: {}", response.getResult().getOutput().getContent());
});
}
}
```

logResult:
ChatClient:

```java
contextRunner
.withPropertyValues("spring.ai.openai.chat.options.model=gpt-4o","spring.ai.openai.chat.options.temperature=0.1")
.run(context -> {
OpenAiChatModel chatModel = context.getBean(OpenAiChatModel.class);
String textContent = "What's the weather like in San Francisco, Tokyo, and Paris?";
ChatClient client = ChatClient.builder(chatModel).build();
String content = client.prompt().user(textContent).call().content();
logger.info("Response: {}", content);
});
```

logResult:
The reason for this is that the autoconfiguration adds the container's FunctionCallbacks to the model's default ChatOptions:

```java
if (!CollectionUtils.isEmpty(toolFunctionCallbacks)) {
    chatProperties.getOptions().getFunctionCallbacks().addAll(toolFunctionCallbacks);
}
```
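To make the effect of that line concrete, here is a deliberately simplified, self-contained model of the pollution described at the top of this issue (plain Java collections standing in for ChatOptions and the request; this is not the actual Spring AI merge logic, and all names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class DefaultOptionsPollutionDemo {

    // Simplified stand-in for ChatOptions: just the function callback names it carries.
    record Options(List<String> functionCallbacks) {}

    // Simplified stand-in for building a request: runtime options are layered on top of defaults.
    static List<String> toolsSentWithRequest(Options defaults, Options runtime) {
        List<String> tools = new ArrayList<>(defaults.functionCallbacks());
        tools.addAll(runtime.functionCallbacks());
        return tools;
    }

    public static void main(String[] args) {
        // The autoconfiguration effectively puts the container bean into the default options.
        Options defaults = new Options(new ArrayList<>(List.of("MockWeatherService")));

        // A caller that never asked for any function...
        Options emptyRuntime = new Options(new ArrayList<>());

        // ...still ends up with the tool attached to its request.
        System.out.println(toolsSentWithRequest(defaults, emptyRuntime)); // [MockWeatherService]
    }
}
```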
Afterwards, I debugged carefully and found that the reason the above issue didn't occur when calling the ChatModel directly is this code in handleFunctionCallbackConfigurations:

```java
options.getFunctionCallbacks().stream().forEach(functionCallback -> {
    // Register the tool callback.
    if (isRuntimeCall) {
        this.functionCallbackRegister.put(functionCallback.getName(), functionCallback);
    }
    else {
        this.functionCallbackRegister.putIfAbsent(functionCallback.getName(), functionCallback);
    }
    // Automatically enable the function, usually from prompt callback.
    if (isRuntimeCall) {
        functionToCall.add(functionCallback.getName());
    }
});
```

So the default FunctionCallbacks are only registered, never enabled, and subsequent calls to the model do not automatically include them in the request.

At this point, I'm wondering: what exactly does the isRuntimeCall parameter do? In the source code it is used like this:

```java
if (prompt.getOptions() != null) {
    // ...
    Set<String> promptEnabledFunctions = this.handleFunctionCallbackConfigurations(updatedRuntimeOptions,
            IS_RUNTIME_CALL);
    // ...
}
if (this.defaultOptions != null) {
    // ...
    Set<String> defaultEnabledFunctions = this.handleFunctionCallbackConfigurations(this.defaultOptions,
            !IS_RUNTIME_CALL);
    // ...
}
```

So I'd like to ask: what is the special consideration behind the isRuntimeCall parameter? And I think this leads to a new bug, due to the following code:

```java
// Register the tool callback.
if (isRuntimeCall) {
    this.functionCallbackRegister.put(functionCallback.getName(), functionCallback);
}
else {
    this.functionCallbackRegister.putIfAbsent(functionCallback.getName(), functionCallback);
}
```

When a FunctionCallback with the same name is passed in the prompt options (isRuntimeCall == true), put() overwrites the callback registered from the container, and that affects all subsequent calls:

```java
@EnabledIfEnvironmentVariable(named = "OPENAI_API_KEY", matches = ".*")
public class FunctionCallbackPollutesTest {
private final Logger logger = LoggerFactory.getLogger(FunctionCallbackPollutesTest.class);
private final ApplicationContextRunner contextRunner = new ApplicationContextRunner()
.withPropertyValues("spring.ai.openai.apiKey=" + System.getenv("OPENAI_API_KEY"))
.withPropertyValues("spring.ai.openai.baseUrl=" + System.getenv("OPENAI_BASE_URL"))
.withBean("MockWeatherService", FunctionCallback.class,()->
FunctionCallbackWrapper.builder(new MockWeatherService())
.withName("MockWeatherService")
.withDescription("Get the weather in location")
.withResponseConverter((response) -> "" + response.temp() + response.unit())
.build())
.withBean(MockWeatherService.class)
.withConfiguration(AutoConfigurations.of(OpenAiAutoConfiguration.class));
@Test
void functionCallTest() {
contextRunner
.withPropertyValues("spring.ai.openai.chat.options.model=gpt-4o",
"spring.ai.openai.chat.options.temperature=0.1")
.run(context -> {
OpenAiChatModel chatModel = context.getBean(OpenAiChatModel.class);
FunctionCallback callback = FunctionCallbackWrapper.builder(new MockWeatherService())
.withName("MockWeatherService")
.withDescription("Get the weather in location")
.withResponseConverter((response) -> "I dont know")
.build();
UserMessage userMessage = new UserMessage(
"What's the weather like in San Francisco, Tokyo, and Paris?");
var promptOptions = OpenAiChatOptions.builder()
.withFunctionCallbacks(List.of(callback))
.build();
ChatResponse response = chatModel.call(new Prompt(List.of(userMessage), promptOptions));
logger.info("Response: {}", response.getResult().getOutput().getContent());
var promptOptions2 = OpenAiChatOptions.builder()
.withFunctions(Set.of("MockWeatherService"))
.build();
ChatResponse response2 = chatModel.call(new Prompt(List.of(userMessage), promptOptions2));
logger.info("Response: {}", response2.getResult().getOutput().getContent());
});
}
}
```

logResult:
If you remove the first call, you can use the container's MockWeatherService callback via withFunctions and get the correct result:

```java
// var promptOptions = OpenAiChatOptions.builder()
// .withFunctionCallbacks(List.of(callback))
// .build();
// ChatResponse response = chatModel.call(new Prompt(List.of(userMessage), promptOptions));
// logger.info("Response: {}", response.getResult().getOutput().getContent());
var promptOptions2 = OpenAiChatOptions.builder()
.withFunctions(Set.of("MockWeatherService"))
.build();
ChatResponse response2 = chatModel.call(new Prompt(List.of(userMessage), promptOptions2));
logger.info("Response: {}", response2.getResult().getOutput().getContent()); logResult:
In this case (including concurrent calls), a FunctionCallback passed in at runtime ends up in the shared functionCallbackRegister and can affect other requests.
Finally, this is where the isRuntimeCall parameter matters: if we remove this parameter, we'll get the bug I guessed at the beginning. What do you think?
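To see why the put()/putIfAbsent() split matters for the new bug described above, here is a standalone sketch in which a plain java.util.Map stands in for functionCallbackRegister and strings stand in for the real FunctionCallbacks (a simplified model, not Spring AI code):

```java
import java.util.HashMap;
import java.util.Map;

public class RegisterOverwriteDemo {

    public static void main(String[] args) {
        Map<String, String> functionCallbackRegister = new HashMap<>();

        // Default-options path (isRuntimeCall == false): putIfAbsent keeps the
        // container's bean in place.
        functionCallbackRegister.putIfAbsent("MockWeatherService", "container bean -> real temperature");

        // A request passes a callback with the same name in its prompt options
        // (isRuntimeCall == true): put() overwrites the registered entry.
        functionCallbackRegister.put("MockWeatherService", "prompt callback -> \"I dont know\"");

        // A later request that only enables the function by name now resolves the
        // overwritten entry, matching the behaviour observed in the test above.
        System.out.println(functionCallbackRegister.get("MockWeatherService"));
    }
}
```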