@@ -278,7 +278,7 @@ public static McpServerFeatures.AsyncToolSpecification toAsyncToolSpecification(
 * </ul>
 * @param toolCallback the Spring AI tool callback to convert
 * @param mimeType the MIME type of the output content
-* @return an MCP asynchronous tool specificaiotn that wraps the tool callback
+* @return an MCP asynchronous tool specification that wraps the tool callback
 * @see McpServerFeatures.AsyncToolSpecification
 * @see Schedulers#boundedElastic()
 */
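The Javadoc above belongs to a helper that adapts a Spring AI `ToolCallback` for an asynchronous MCP server. As a minimal sketch only (the `McpToolUtils` class name and its package are assumptions; the diff shows just the method and its Javadoc), a caller might use it like this:

[source,java]
----
import io.modelcontextprotocol.server.McpServerFeatures;
import org.springframework.ai.mcp.McpToolUtils; // assumed location of the helper shown in the diff
import org.springframework.ai.tool.ToolCallback;
import org.springframework.util.MimeTypeUtils;

class AsyncToolSpecificationSketch {

    McpServerFeatures.AsyncToolSpecification adapt(ToolCallback toolCallback) {
        // Wrap the (potentially blocking) tool callback so an async MCP server can expose it.
        // Per the Javadoc above, the result carries the supplied MIME type and the blocking
        // work is scheduled via Schedulers.boundedElastic().
        return McpToolUtils.toAsyncToolSpecification(toolCallback, MimeTypeUtils.TEXT_PLAIN);
    }
}
----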
@@ -296,7 +296,7 @@ Flux<String> output = chatClient.prompt()
 You can also stream the `ChatResponse` using the method `Flux<ChatResponse> chatResponse()`.
 
 In the future, we will offer a convenience method that will let you return a Java entity with the reactive `stream()` method.
-In the meantime, you should use the xref:api/structured-output-converter.adoc#StructuredOutputConverter[Structured Output Converter] to convert the aggregated response explicity as shown below.
+In the meantime, you should use the xref:api/structured-output-converter.adoc#StructuredOutputConverter[Structured Output Converter] to convert the aggregated response explicitly as shown below.
 This also demonstrates the use of parameters in the fluent API that will be discussed in more detail in a later section of the documentation.
 
 [source,java]
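The documentation's own example under `[source,java]` is collapsed in this diff view. As a hedged sketch of the pattern described above (the `ActorFilms` record, the prompt text, and the variable names are illustrative assumptions), aggregating the streamed content and converting it explicitly could look like this:

[source,java]
----
// Illustrative only: ActorFilms and the prompt are invented for this sketch.
record ActorFilms(String actor, List<String> movies) {}

var converter = new BeanOutputConverter<>(
        new ParameterizedTypeReference<List<ActorFilms>>() {});

Flux<String> output = chatClient.prompt()
        .user(u -> u.text("""
                Generate the filmography for a random actor.
                {format}
                """)
                .param("format", converter.getFormat())) // fluent-API parameter substitution
        .stream()
        .content();

// Aggregate the streamed chunks into one string, then convert it explicitly.
String aggregated = output.collectList()
        .map(chunks -> String.join("", chunks))
        .block();
List<ActorFilms> films = converter.convert(aggregated);
----

Blocking with `block()` here only serves the illustration; in reactive code you would keep the pipeline non-blocking.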
@@ -145,7 +145,7 @@ The prefix `spring.ai.azure.openai` is the property prefix to configure the conn
 | spring.ai.azure.openai.endpoint | The endpoint from the Azure AI OpenAI `Keys and Endpoint` section under `Resource Management` | -
 | spring.ai.azure.openai.openai-api-key | (non Azure) OpenAI API key. Used to authenticate with the OpenAI service, instead of Azure OpenAI.
 This automatically sets the endpoint to https://api.openai.com/v1. Use either `api-key` or `openai-api-key` property.
-With this configuration the `spring.ai.azure.openai.embedding.options.deployment-name` is threated as an https://platform.openai.com/docs/models[OpenAi Model] name.| -
+With this configuration the `spring.ai.azure.openai.embedding.options.deployment-name` is treated as an https://platform.openai.com/docs/models[OpenAi Model] name.| -
 |====
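For orientation only, a minimal `application.properties` sketch using the non-Azure key described in this row might look as follows (the embedding model name is just an example value):

[source,properties]
----
# Authenticate directly against OpenAI instead of Azure OpenAI;
# the endpoint then defaults to https://api.openai.com/v1.
spring.ai.azure.openai.openai-api-key=${OPENAI_API_KEY}

# With openai-api-key set, the deployment name is treated as an OpenAI model name
# (example value below).
spring.ai.azure.openai.embedding.options.deployment-name=text-embedding-3-small
----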


@@ -116,7 +116,7 @@ They measure the time spent in the advisor (including the time spend on the inne
 == Chat Model
 
 NOTE: Observability features are currently supported only for `ChatModel` implementations from the following AI model
-providers: Anthropic, Azure OpenAI, Mistral AI, Ollama, OpenAI, Vertex AI, MiniMax, Moonshot, QianFan, Zhiu AI.
+providers: Anthropic, Azure OpenAI, Mistral AI, Ollama, OpenAI, Vertex AI, MiniMax, Moonshot, QianFan, Zhipu AI.
 Additional AI model providers will be supported in a future release.
 
 The `gen_ai.client.operation` observations are recorded when calling the ChatModel `call` or `stream` methods.
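As a short, hedged illustration (the injected `chatModel` bean and the prompt text are assumptions), either of the following invocations is recorded as a `gen_ai.client.operation` observation:

[source,java]
----
// Synchronous call: recorded as a gen_ai.client.operation observation.
ChatResponse response = chatModel.call(new Prompt("Tell me a joke"));

// Streaming call: also recorded under gen_ai.client.operation.
Flux<ChatResponse> stream = chatModel.stream(new Prompt("Tell me another joke"));
----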