spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/prompt-engineering-patterns.adoc (6 additions, 9 deletions)
@@ -2,8 +2,8 @@
 = Prompt Engineering Patterns
 
 Practical implementations of Prompt Engineering techniques based on the comprehensive link:https://www.kaggle.com/whitepaper-prompt-engineering[Prompt Engineering Guide].
-The guide covers the theory, principles, and patterns of effective prompt engineering, while here we demonstrate how to translate those concepts into working Java code using Spring AI's fluent link:https://docs.spring.io/spring-ai/reference/api/chatclient.html[ChatClient API].
-The demo source code used in this article is available at: https://github.com/spring-projects/spring-ai-examples/tree/main/prompt-engineering/prompt-engineering-patterns
+The guide covers the theory, principles, and patterns of effective prompt engineering, while here we demonstrate how to translate those concepts into working Java code using Spring AI's fluent xref::api/chatclient.adoc[ChatClient API].
+The demo source code used in this article is available at: link:https://github.com/spring-projects/spring-ai-examples/tree/main/prompt-engineering/prompt-engineering-patterns[Prompt Engineering Patterns Examples].
 
 == 1. Configuration
 
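For orientation on the fluent ChatClient API that the hunk above links to, here is a minimal sketch, assuming a Spring Boot application with a chat model starter on the classpath so that a `ChatClient.Builder` is auto-configured; the `PromptDemo` class, `summarize` method, and prompt text are illustrative only, not part of the referenced examples.

[source,java]
----
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

// Illustrative sketch of the fluent ChatClient API: build the client once,
// then compose system/user messages per request and call the model.
@Service
public class PromptDemo {

    private final ChatClient chatClient;

    public PromptDemo(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    public String summarize(String text) {
        // system() sets the instruction, user() carries the content,
        // call() executes synchronously and content() returns the text response.
        return chatClient.prompt()
                .system("You are a concise technical summarizer.")
                .user("Summarize the following text in one sentence:\n" + text)
                .call()
                .content();
    }
}
----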
@@ -13,7 +13,7 @@ It covers selecting the right LLM provider for your use case and configuring imp
 === LLM Provider Selection
 
 For prompt engineering, you will start by choosing a model.
-Spring AI supports link:https://docs.spring.io/spring-ai/reference/api/chat/comparison.html[multiple LLM providers] (such as OpenAI, Anthropic, Google Vertex AI, AWS Bedrock, Ollama and more), letting you switch providers without changing application code - just update your configuration.
+Spring AI supports xref::api/chat/comparison.adoc[multiple LLM providers] (such as OpenAI, Anthropic, Google Vertex AI, AWS Bedrock, Ollama and more), letting you switch providers without changing application code - just update your configuration.
 Just add the selected starter dependency `spring-ai-starter-model-<MODEL-PROVIDER-NAME>`.
 
 For example, here is how to enable Anthropic Claude API:
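The dependency snippet that this hunk refers to is collapsed in the diff; as a sketch, following the `spring-ai-starter-model-<MODEL-PROVIDER-NAME>` naming pattern described above, enabling the Anthropic starter in Maven would look roughly like this (version typically managed by the Spring AI BOM; verify the coordinates against the release you use).

[source,xml]
----
<!-- Sketch of the collapsed example: Anthropic starter following the
     spring-ai-starter-model-<MODEL-PROVIDER-NAME> naming pattern. -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-anthropic</artifactId>
</dependency>
----

With the starter on the classpath, supplying the provider API key (for example via `spring.ai.anthropic.api-key` in `application.properties`) is usually all that is needed to activate the model.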
@@ -25,10 +25,7 @@ For example, here is how to enable Anthropic Claude API:
-Before we dive into prompt engineering techniques, it's essential to understand how to configure the LLM's output behavior. Spring AI provides several configuration options that let you control various aspects of generation through the link:https://docs.spring.io/spring-ai/reference/api/chatmodel.html#_chat_options[ChatOptions] builder.
+Before we dive into prompt engineering techniques, it's essential to understand how to configure the LLM's output behavior. Spring AI provides several configuration options that let you control various aspects of generation through the xref:/api/chatmodel.adoc#_chat_options[ChatOptions] builder.
 
 All configurations can be applied programmatically as demonstrated in the examples below or through Spring application properties at start time.
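As a minimal sketch of the programmatic route mentioned in that context line, portable options can be built with the `ChatOptions` builder and applied per request; the class name, helper method, and parameter values below are illustrative, not recommendations.

[source,java]
----
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.prompt.ChatOptions;

// Illustrative sketch: portable ChatOptions built programmatically and applied to one call.
// Provider-specific option classes (e.g. AnthropicChatOptions) expose additional settings.
public class ChatOptionsExample {

    public static String ask(ChatClient chatClient, String question) {
        ChatOptions options = ChatOptions.builder()
                .temperature(0.2)   // lower temperature favours more deterministic output
                .maxTokens(500)     // cap the response length
                .topP(0.9)
                .build();

        return chatClient.prompt()
                .options(options)   // override the model defaults for this single call
                .user(question)
                .call()
                .content();
    }
}
----

The same knobs can also be set through Spring application properties at start time, which is the second route the paragraph above mentions.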