chore: [OpenAI] (PoC) Generated model classes #266
Conversation
# Conflicts:
#   foundation-models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClient.java

…-openai
# Conflicts:
#   foundation-models/openai/src/main/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiChatCompletionDelta.java
#   foundation-models/openai/src/test/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientTest.java
```java
return new ClientStreamingHandler<>(deltaType, OpenAiError.class, OpenAiClientException::new)
    .objectMapper(
        JACKSON
            .addMixIn(
                CreateChatCompletionResponse.class,
                JacksonMixins.CreateChatCompletionResponseMixIn.class)
            .addMixIn(
                CreateChatCompletionStreamResponse.class,
                JacksonMixins.CreateChatCompletionStreamResponseMixIn.class)
            .addMixIn(
                ChatCompletionsCreate200Response.class,
                JacksonMixins.ChatCompletionCreate200ResponseMixIn.class))
    .handleStreamingResponse(client.executeOpen(null, request, null));
```
(Comment)
Workaround, because Jackson is unable to determine the deserializer by itself for:
- ChatCompletionsCreate200Response
- CreateChatCompletionStreamResponse extends *
- CreateChatCompletionResponse extends *
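For illustration, here is a minimal sketch of the mixin technique used above, with hypothetical stand-in classes (not the generated SDK types): a mixin registered via `addMixIn` pins the deserialization target for a base type that Jackson cannot resolve on its own.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;

public class MixinDemo {
  abstract static class BaseResponse {}

  static class ConcreteResponse extends BaseResponse {
    public String id;
  }

  // The mixin forces BaseResponse payloads to deserialize as ConcreteResponse,
  // without touching the (generated) class itself.
  @JsonDeserialize(as = ConcreteResponse.class)
  abstract static class BaseResponseMixIn {}

  public static void main(String[] args) throws Exception {
    ObjectMapper mapper =
        new ObjectMapper().addMixIn(BaseResponse.class, BaseResponseMixIn.class);
    BaseResponse r = mapper.readValue("{\"id\":\"42\"}", BaseResponse.class);
    System.out.println(r.getClass().getSimpleName()); // ConcreteResponse
  }
}
```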
(Question)
Maybe I missed the problem, but I don't see why deduction would not work: there is an additional property prompt_filter_results in CreateChatCompletionResponse that is not present in the other class.
As I understand it, as long as there is a distinguishing property, Jackson should be able to resolve the subtype.
> don't see the reason why deduction would not work

The actual payload for streamed responses differs significantly from the specification; it is inconsistent. Whatever annotation or setting we choose will be a compromise. If you find deduction working, or if you prefer a different mixin, please be my guest and change it as part of the upcoming convenience API draft.
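For reference, a minimal sketch of the deduction approach under discussion, with hypothetical stand-in classes: with `JsonTypeInfo.Id.DEDUCTION`, Jackson prunes subtype candidates based on which observed fields they declare, so a distinguishing property such as `prompt_filter_results` selects the matching variant without an explicit discriminator.

```java
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DeductionDemo {

  @JsonTypeInfo(use = JsonTypeInfo.Id.DEDUCTION)
  @JsonSubTypes({@JsonSubTypes.Type(Plain.class), @JsonSubTypes.Type(Filtered.class)})
  abstract static class Response {}

  static class Plain extends Response {
    public String id;
    public String delta; // only the "plain" variant has this field
  }

  static class Filtered extends Response {
    public String id;
    public String prompt_filter_results; // distinguishing property
  }

  public static void main(String[] args) throws Exception {
    final ObjectMapper mapper = new ObjectMapper();
    // Each payload contains a field unique to one subtype, so deduction is unambiguous.
    Response a = mapper.readValue("{\"id\":\"1\",\"delta\":\"x\"}", Response.class);
    Response b =
        mapper.readValue("{\"id\":\"2\",\"prompt_filter_results\":\"[]\"}", Response.class);
    System.out.println(a.getClass().getSimpleName()); // Plain
    System.out.println(b.getClass().getSimpleName()); // Filtered
  }
}
```

Note that deduction fails with an exception when a payload matches multiple candidates, which is exactly the risk when the streamed payload deviates from the specification.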
```diff
 @Nonnull
-public OpenAiChatCompletionOutput chatCompletion(@Nonnull final String prompt)
+public CreateChatCompletionResponse chatCompletion(@Nonnull final String prompt)
     throws OpenAiClientException {
-  final OpenAiChatCompletionParameters parameters = new OpenAiChatCompletionParameters();
+  final CreateChatCompletionRequest parameters = new CreateChatCompletionRequest();
   if (systemPrompt != null) {
-    parameters.addMessages(new OpenAiChatSystemMessage().setContent(systemPrompt));
+    parameters.addMessagesItem(
+        new ChatCompletionRequestSystemMessage()
+            .role(ChatCompletionRequestSystemMessage.RoleEnum.SYSTEM)
+            .content(ChatCompletionRequestSystemMessageContent.create(systemPrompt)));
   }
-  parameters.addMessages(new OpenAiChatUserMessage().addText(prompt));
+  parameters
+      .addMessagesItem(
+          new ChatCompletionRequestUserMessage()
+              .role(ChatCompletionRequestUserMessage.RoleEnum.USER)
+              .content(ChatCompletionRequestUserMessageContent.create(prompt)))
+      .functions(null)
+      .tools(null)
+      .parallelToolCalls(null);
   return chatCompletion(parameters);
 }
```
(Comment)
This is the unfortunate diff expected for regular usage, without a convenience API.
```diff
-  "messages" : [ {
-    "role" : "system",
-    "content" : "You are a helpful AI"
-  }, {
-    "role" : "user",
-    "content" : [ {
-      "type" : "text",
-      "text" : "Hello World! Why is this phrase so famous?"
-    } ]
-  } ]
+  "temperature" : 1,
+  "top_p" : 1,
+  "stream" : false,
+  "presence_penalty" : 0,
+  "frequency_penalty" : 0,
+  "messages" : [ {
+    "content" : "You are a helpful AI",
+    "role" : "system"
+  }, {
+    "content" : "Hello World! Why is this phrase so famous?",
+    "role" : "user"
+  } ],
+  "logprobs" : false,
+  "n" : 1,
+  "parallel_tool_calls" : true,
+  "tools" : [ ],
+  "functions" : [ ]
```
(Comment)
Diff in the payload being sent.
It may require further fine-tuning to stop sending default values and/or empty arrays.
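One possible direction for that fine-tuning, sketched with a hypothetical request class (not the generated one): Jackson's serialization inclusion settings can drop null fields and empty collections from the outgoing payload.

```java
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class InclusionDemo {
  // Hypothetical stand-in for a generated request class.
  static class Request {
    public Double temperature;             // null     -> omitted
    public List<String> tools = List.of(); // empty    -> omitted with NON_EMPTY
    public String model = "gpt-4o";        // non-empty -> kept
  }

  public static void main(String[] args) throws Exception {
    // NON_EMPTY suppresses both nulls and empty arrays/collections.
    final ObjectMapper mapper =
        new ObjectMapper().setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
    System.out.println(mapper.writeValueAsString(new Request()));
    // {"model":"gpt-4o"}
  }
}
```

Whether this is safe depends on the API: some services distinguish an absent field from an explicit empty array, so the inclusion policy may need to be set per field rather than globally.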
```diff
-    .isEqualTo(new float[] {0.0f, 3.4028235E38f, 1.4E-45f, 1.23f, -4.56f});
+    .containsExactly(
+        new BigDecimal("0.0"),
+        new BigDecimal("3.4028235E+38"),
+        new BigDecimal("1.4E-45"),
+        new BigDecimal("1.23"),
+        new BigDecimal("-4.56"));
```
(Major/Comment)
Instead of float[] we now have List<BigDecimal> for embedding vectors. This needs fine-tuning.
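Until that is fine-tuned, a small hypothetical helper (not part of the SDK) can convert the generated List<BigDecimal> back into the float[] shape consumers may expect:

```java
import java.math.BigDecimal;
import java.util.List;

public class EmbeddingConvert {
  // Converts a generated List<BigDecimal> embedding vector to float[].
  // Note: values outside float range lose precision or collapse to infinity.
  static float[] toFloatArray(List<BigDecimal> embedding) {
    final float[] result = new float[embedding.size()];
    for (int i = 0; i < result.length; i++) {
      result[i] = embedding.get(i).floatValue();
    }
    return result;
  }

  public static void main(String[] args) {
    float[] v = toFloatArray(List.of(new BigDecimal("1.23"), new BigDecimal("-4.56")));
    System.out.println(v[0] + " " + v[1]); // 1.23 -4.56
  }
}
```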
```java
OpenAiChatCompletionOutput totalOutput = new OpenAiChatCompletionOutput();
final List<OpenAiChatCompletionDelta> deltaList =
    stream.peek(totalOutput::addDelta).toList();
```
(Major/Comment)
I found the totalOutput handling confusing. I did not write any basic convenience for that; instead, I made sure the information can be extracted, with casting and/or custom field lookups. This needs refinement.
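The aggregation pattern itself can be sketched with hypothetical stand-in types (the real OpenAiChatCompletionDelta accumulation is more involved): `Stream.peek` feeds each delta into an accumulator while the terminal `toList()` still collects the individual chunks.

```java
import java.util.List;
import java.util.stream.Stream;

public class DeltaAggregation {
  // Hypothetical stand-in for a streamed delta chunk.
  static class Delta {
    final String content;
    Delta(String content) { this.content = content; }
  }

  // Hypothetical stand-in for the aggregated total output.
  static class Total {
    final StringBuilder content = new StringBuilder();
    void addDelta(Delta d) { content.append(d.content); }
  }

  public static void main(String[] args) {
    final Total total = new Total();
    // peek() runs as a side effect while toList() drains the stream.
    final List<Delta> deltas =
        Stream.of(new Delta("Hello"), new Delta(" "), new Delta("World"))
            .peek(total::addDelta)
            .toList();
    System.out.println(total.content); // Hello World
    System.out.println(deltas.size()); // 3
  }
}
```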
```diff
 @Nonnull
-public OpenAiChatCompletionOutput chatCompletionImage(@Nonnull final String linkToImage) {
+public CreateChatCompletionResponse chatCompletionImage(@Nonnull final String linkToImage) {
+  final var partText =
+      new ChatCompletionRequestMessageContentPartText()
+          .type(TEXT)
+          .text("Describe the following image.");
+  final var partImageUrl =
+      new ChatCompletionRequestMessageContentPartImageImageUrl()
+          .url(URI.create(linkToImage))
+          .detail(HIGH);
+  final var partImage =
+      new ChatCompletionRequestMessageContentPartImage().type(IMAGE_URL).imageUrl(partImageUrl);
+  final var userMessage =
+      new ChatCompletionRequestUserMessage()
+          .role(USER)
+          .content(ChatCompletionRequestUserMessageContent.create(List.of(partText, partImage)));
   final var request =
-      new OpenAiChatCompletionParameters()
-          .addMessages(
-              new OpenAiChatMessage.OpenAiChatUserMessage()
-                  .addText("Describe the following image.")
-                  .addImage(
-                      linkToImage,
-                      OpenAiChatMessage.OpenAiChatUserMessage.ImageDetailLevel.HIGH));
+      new CreateChatCompletionRequest()
+          .addMessagesItem(userMessage)
+          .functions(null)
+          .tools(null)
+          .parallelToolCalls(null);
   return OpenAiClient.forModel(GPT_4O).chatCompletion(request);
 }
```
(Comment)
Another showcase of consuming the generated OpenAI code without a convenience API.
# Conflicts:
#   foundation-models/openai/src/test/java/com/sap/ai/sdk/foundationmodels/openai/OpenAiClientTest.java
Closed in favor of #322
Context
https://github.tools.sap/AI/ai-sdk-java-backlog/issues/129
Generated code for OpenAI specification 2024-10-02
Feature scope:
Definition of Done
- Aligned changes with the JavaScript SDK
- Documentation updated
- Release notes updated