articles/flow/ai-support/index.adoc (+2 -2)
@@ -28,15 +28,15 @@ ifdef::flow[]
The AI support module consists of three parts:
* **Orchestrator** -- [classname]`AIOrchestrator` is a non-visual coordination engine that connects UI components to an LLM provider. It has no DOM element and should not be added to a layout.
-* **Provider** -- [classname]`LLMProvider` is the interface for communicating with LLM frameworks. Use the static [methodname]`LLMProvider.from(...)` factory methods to create a provider from a Spring AI or LangChain4j model. You can also implement this interface to connect to any other LLM framework.
+* **Provider** -- [classname]`LLMProvider` is the interface for communicating with LLM frameworks. Create a provider by instantiating [classname]`SpringAILLMProvider` or [classname]`LangChain4JLLMProvider` directly. You can also implement this interface to connect to any other LLM framework.
* **Component interfaces** -- [classname]`AIInput`, [classname]`AIMessageList`, [classname]`AIMessage`, and [classname]`AIFileReceiver` define contracts for UI components that the orchestrator can work with. The builder also accepts standard Vaadin components ([classname]`MessageInput`, [classname]`MessageList`, [classname]`UploadManager`, [classname]`Upload`) directly.
Add the UI components to your layout and pass them to the orchestrator through its builder. The orchestrator wires them together and manages the LLM interaction.
== Basic Usage
-Create an AI Orchestrator by passing an LLM provider and optional UI components to the builder. The following example uses a mock provider -- replace it with a real provider via [methodname]`LLMProvider.from(...)` in production (see <<llm-providers#,LLM Providers>>).
+Create an AI Orchestrator by passing an LLM provider and optional UI components to the builder. The following example uses a mock provider -- replace it with a real provider such as [classname]`SpringAILLMProvider` or [classname]`LangChain4JLLMProvider` in production (see <<llm-providers#,LLM Providers>>).
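As a rough illustration of the pattern the paragraph above describes, a builder call might look like the following sketch. The builder method names (`AIOrchestrator.builder(...)`, `withMessageList(...)`, `withInput(...)`) and the `MockLLMProvider` class are assumptions for illustration only, not confirmed API:

[source,java]
----
// Hypothetical sketch -- builder method names and MockLLMProvider are assumed.
MessageList messageList = new MessageList();
MessageInput messageInput = new MessageInput();
add(messageList, messageInput); // only the visual components go into the layout

// The orchestrator is non-visual and is never added to the layout;
// it wires the components together and manages the LLM interaction.
AIOrchestrator orchestrator = AIOrchestrator.builder(new MockLLMProvider())
        .withMessageList(messageList) // where responses are rendered
        .withInput(messageInput)      // where the user types prompts
        .build();
----

In production, the mock provider would be replaced by a real one such as `new SpringAILLMProvider(chatModel)`.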
articles/flow/ai-support/llm-providers.adoc (+7 -7)
@@ -10,7 +10,7 @@ order: 20
ifdef::flow[]
-The orchestrator uses the [classname]`LLMProvider` interface to communicate with LLM frameworks. Use the static [methodname]`LLMProvider.from(...)` factory methods to create a provider from a vendor-specific model or client object. Two implementations are provided: one for Spring AI and one for LangChain4j.
+The orchestrator uses the [classname]`LLMProvider` interface to communicate with LLM frameworks. Create a provider by instantiating the appropriate implementation directly. Two implementations are provided: one for Spring AI and one for LangChain4j.
.Memory Window Limit
[IMPORTANT]
@@ -26,12 +26,12 @@ Both built-in providers maintain a 30-message memory window. Older messages are
// From ChatModel - use an implementation of Spring AI ChatModel
SpringAILLMProvider provider = new SpringAILLMProvider(chatModel);
----
When created from a [classname]`ChatModel`, the provider manages its own conversation memory using a 30-message window. When created from a [classname]`ChatClient`, memory must be configured externally on the client.
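The two construction paths described above can be sketched side by side. The `SpringAILLMProvider` constructors are the ones named in this document; the `MessageChatMemoryAdvisor` wiring is shown only as one conventional way to configure external memory on a Spring AI `ChatClient`, assuming a `ChatMemory` instance is available:

[source,java]
----
// Path 1: from a ChatModel -- the provider manages its own
// conversation memory with a 30-message window.
SpringAILLMProvider fromModel = new SpringAILLMProvider(chatModel);

// Path 2: from a ChatClient -- memory must be configured on the client
// itself; a memory advisor is one possible approach (assumption).
ChatClient chatClient = ChatClient.builder(chatModel)
        .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build())
        .build();
SpringAILLMProvider fromClient = new SpringAILLMProvider(chatClient);
----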
@@ -45,24 +45,24 @@ provider.setStreaming(false);
.History Restoration with ChatClient
[NOTE]
-History restoration via [methodname]`withHistory()` is only supported when creating the provider from a [classname]`ChatModel`. Providers created from a [classname]`ChatClient` do not provide access to internal memory, so calling [methodname]`setHistory()` throws an [classname]`UnsupportedOperationException`. Use [methodname]`LLMProvider.from(chatModel)` if you need to restore conversation history across sessions.
+History restoration via [methodname]`withHistory()` is only supported when creating the provider from a [classname]`ChatModel`. Providers created from a [classname]`ChatClient` do not provide access to internal memory, so calling [methodname]`setHistory()` throws an [classname]`UnsupportedOperationException`. Use `new SpringAILLMProvider(chatModel)` if you need to restore conversation history across sessions.
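A sketch of the restriction described in the note. The `withHistory()` and `setHistory()` names come from the note itself, but their exact signatures (a fluent `withHistory` taking previously stored messages) and the `loadHistoryForUser` helper are assumptions for illustration:

[source,java]
----
// Restoring history only works for a ChatModel-backed provider.
List<AIMessage> savedHistory = loadHistoryForUser(userId); // hypothetical helper
SpringAILLMProvider provider = new SpringAILLMProvider(chatModel)
        .withHistory(savedHistory); // fluent signature assumed

// A ChatClient-backed provider has no access to internal memory:
// new SpringAILLMProvider(chatClient).setHistory(savedHistory);
// -> throws UnsupportedOperationException
----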
== LangChain4j
-[classname]`LangChain4JLLMProvider` supports both streaming and synchronous LangChain4j models. The mode is determined by the model type passed to the factory method:
+[classname]`LangChain4JLLMProvider` supports both streaming and synchronous LangChain4j models. The mode is determined by the model type passed to the constructor:
[source,java]
----
// Streaming mode - use an implementation of LangChain4j StreamingChatModel