= Bedrock Converse API

The link:https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html[Amazon Bedrock] Converse API provides a unified interface for conversational AI models with enhanced capabilities including function/tool calling, multimodal inputs, and streaming responses.

The Bedrock Converse API has the following high-level features:

* Tool/Function Calling: Support for function definitions and tool use during conversations
* Multimodal Input: Ability to process both text and image inputs in conversations
* Streaming Support: Real-time streaming of model responses
* System Messages: Support for system-level instructions and context setting
* Metrics Integration: Built-in support for observation and metrics tracking

The https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html[Amazon Bedrock User Guide] contains detailed information on how to use the AWS-hosted service.

TIP: The Bedrock Converse API provides a unified interface across multiple model providers while handling AWS-specific authentication and infrastructure concerns.

== Prerequisites

Refer to the xref:api/bedrock.adoc[Spring AI documentation on Amazon Bedrock] for setting up API access.

* Obtain AWS credentials: If you do not yet have an AWS account and the AWS CLI configured, this video guide can help you set them up: link:https://youtu.be/gswVHTrRX8I?si=buaY7aeI0l3-bBVb[AWS CLI & SDK Setup in Less Than 4 Minutes!]. You should be able to obtain your access and secret keys.

* Enable the models to use: Go to link:https://us-east-1.console.aws.amazon.com/bedrock/home[Amazon Bedrock] and, from the link:https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/modelaccess[Model Access] menu on the left, configure access to the models you are going to use.

== Auto-configuration

Add the `spring-ai-bedrock-converse-spring-boot-starter` dependency to your project's Maven `pom.xml` or Gradle `build.gradle` build file:

[tabs]
======
Maven::
+
[source,xml]
----
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-bedrock-converse-spring-boot-starter</artifactId>
</dependency>
----

Gradle::
+
[source,gradle]
----
dependencies {
    implementation 'org.springframework.ai:spring-ai-bedrock-converse-spring-boot-starter'
}
----
======

TIP: Refer to the xref:getting-started.adoc#dependency-management[Dependency Management] section to add the Spring AI BOM to your build file.
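As a concrete illustration of that tip, importing the Spring AI BOM in Maven's `dependencyManagement` lets you omit the starter's version; a minimal sketch (the `spring-ai.version` property is a placeholder you define in your own build):

[source,xml]
----
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
----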

=== Chat Properties

The prefix `spring.ai.bedrock.aws` is the property prefix to configure the connection to AWS Bedrock.

[cols="3,3,1", stripes=even]
|====
| Property | Description | Default

| spring.ai.bedrock.aws.region | AWS region to use. | us-east-1
| spring.ai.bedrock.aws.timeout | AWS timeout to use. | 5m
| spring.ai.bedrock.aws.access-key | AWS access key. | -
| spring.ai.bedrock.aws.secret-key | AWS secret key. | -
| spring.ai.bedrock.aws.session-token | AWS session token for temporary credentials. | -
|====

The prefix `spring.ai.bedrock.converse.chat` is the property prefix that configures the chat model implementation for the Converse API.

[cols="3,5,1", stripes=even]
|====
| Property | Description | Default

| spring.ai.bedrock.converse.chat.enabled | Enable the Bedrock Converse chat model. | true
| spring.ai.bedrock.converse.chat.options.model | The model ID to use. See the https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html[Supported models and model features] page for the available IDs. | anthropic.claude-3-sonnet-20240229-v1:0
| spring.ai.bedrock.converse.chat.options.temperature | Controls the randomness of the output. Values can range over [0.0,1.0]. | 0.8
| spring.ai.bedrock.converse.chat.options.top-p | The maximum cumulative probability of tokens to consider when sampling. | AWS Bedrock default
| spring.ai.bedrock.converse.chat.options.top-k | Number of token choices the model considers when generating the next token. | AWS Bedrock default
| spring.ai.bedrock.converse.chat.options.max-tokens | Maximum number of tokens in the generated response. | 500
|====
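Pulling the two property groups together, a minimal `application.properties` might look like the following sketch (the region, model ID, and credential references are illustrative, not required values):

[source,properties]
----
spring.ai.bedrock.aws.region=us-east-1
spring.ai.bedrock.aws.access-key=${AWS_ACCESS_KEY_ID}
spring.ai.bedrock.aws.secret-key=${AWS_SECRET_ACCESS_KEY}

spring.ai.bedrock.converse.chat.options.model=anthropic.claude-3-sonnet-20240229-v1:0
spring.ai.bedrock.converse.chat.options.temperature=0.8
spring.ai.bedrock.converse.chat.options.max-tokens=500
----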

== Runtime Options [[chat-options]]

Use the portable `ChatOptions` or `FunctionCallingOptions` builders to create model configurations, such as temperature, maxTokens, topP, etc.

On start-up, the default options can be configured with the `BedrockConverseProxyChatModel(api, options)` constructor or the `spring.ai.bedrock.converse.chat.options.*` properties.

At run-time, you can override the default options by adding new, request-specific options to the `Prompt` call:

[source,java]
----
var options = FunctionCallingOptions.builder()
    .withModel("anthropic.claude-3-5-sonnet-20240620-v1:0")
    .withTemperature(0.6)
    .withMaxTokens(300)
    .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(new WeatherService())
        .withName("getCurrentWeather")
        .withDescription("Get the weather in location. Return temperature in 36°F or 36°C format. Use multi-turn if needed.")
        .build()))
    .build();

ChatResponse response = chatModel.call(new Prompt("What is the current weather in Amsterdam?", options));
----

== Tool/Function Calling

The Bedrock Converse API supports function calling, allowing models to use tools during conversations. Here's an example of how to define and use a function:

[source,java]
----
@Bean
@Description("Get the weather in location. Return temperature in 36°F or 36°C format.")
public Function<Request, Response> weatherFunction() {
    return new MockWeatherService();
}

String response = ChatClient.create(this.chatModel)
        .prompt("What's the weather like in Boston?")
        .function("weatherFunction")
        .call()
        .content();
----
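The `Request` and `Response` types and `MockWeatherService` are not defined in the snippet above; a minimal, self-contained sketch of what they might look like (hypothetical names and a canned return value, no Spring involved):

[source,java]
----
import java.util.function.Function;

public class MockWeatherDemo {

    // Hypothetical request/response records matching the bean signature above
    public record Request(String location) {}
    public record Response(double temperature, String unit) {}

    // A canned stand-in for a real weather lookup service
    public static class MockWeatherService implements Function<Request, Response> {
        @Override
        public Response apply(Request request) {
            return new Response(22.0, "C"); // fixed value for illustration
        }
    }

    public static void main(String[] args) {
        Response response = new MockWeatherService().apply(new Request("Boston"));
        System.out.println(response.temperature() + "°" + response.unit()); // prints 22.0°C
    }
}
----

The model receives the function's JSON schema derived from `Request`, calls it with a populated instance, and folds the returned `Response` back into the conversation.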

== Sample Controller

Create a new Spring Boot project and add the `spring-ai-bedrock-converse-spring-boot-starter` to your dependencies.

Add an `application.properties` file under `src/main/resources`:

[source,properties]
----
spring.ai.bedrock.aws.region=eu-central-1
spring.ai.bedrock.aws.timeout=10m
spring.ai.bedrock.aws.access-key=${AWS_ACCESS_KEY_ID}
spring.ai.bedrock.aws.secret-key=${AWS_SECRET_ACCESS_KEY}

spring.ai.bedrock.converse.chat.options.temperature=0.8
spring.ai.bedrock.converse.chat.options.top-k=15
----

Here's an example controller using the chat model:

[source,java]
----
@RestController
public class ChatController {

    private final ChatClient chatClient;

    @Autowired
    public ChatController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @GetMapping("/ai/generate")
    public Map<String, String> generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", this.chatClient.prompt(message).call().content());
    }

    @GetMapping("/ai/generateStream")
    public Flux<String> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return this.chatClient.prompt(message).stream().content();
    }
}
----