Commit 1d3314b

docs: remove llmprovider factory methods (#5383)
1 parent ac2687e commit 1d3314b

File tree: 3 files changed, +11 −11 lines

articles/building-apps/ai/quickstart-guide.adoc

Lines changed: 2 additions & 2 deletions

@@ -121,7 +121,7 @@ package org.vaadin.example;
 
 import com.vaadin.flow.component.Composite;
 import com.vaadin.flow.component.ai.orchestrator.AIOrchestrator;
-import com.vaadin.flow.component.ai.provider.LLMProvider;
+import com.vaadin.flow.component.ai.provider.SpringAILLMProvider;
 import com.vaadin.flow.component.messages.MessageInput;
 import com.vaadin.flow.component.messages.MessageList;
 import com.vaadin.flow.component.orderedlayout.VerticalLayout;

@@ -146,7 +146,7 @@ public class MainView extends Composite<VerticalLayout> {
 messageInput.setWidthFull();
 
 // Create the LLM provider
-var provider = LLMProvider.from(chatModel);
+var provider = new SpringAILLMProvider(chatModel);
 
 // Wire everything together
 AIOrchestrator.builder(provider,

articles/flow/ai-support/index.adoc

Lines changed: 2 additions & 2 deletions

@@ -28,15 +28,15 @@ ifdef::flow[]
 The AI support module consists of three parts:
 
 * **Orchestrator** -- [classname]`AIOrchestrator` is a non-visual coordination engine that connects UI components to an LLM provider. It has no DOM element and should not be added to a layout.
-* **Provider** -- [classname]`LLMProvider` is the interface for communicating with LLM frameworks. Use the static [methodname]`LLMProvider.from(...)` factory methods to create a provider from a Spring AI or LangChain4j model. You can also implement this interface to connect to any other LLM framework.
+* **Provider** -- [classname]`LLMProvider` is the interface for communicating with LLM frameworks. Create a provider by instantiating [classname]`SpringAILLMProvider` or [classname]`LangChain4JLLMProvider` directly. You can also implement this interface to connect to any other LLM framework.
 * **Component interfaces** -- [classname]`AIInput`, [classname]`AIMessageList`, [classname]`AIMessage`, and [classname]`AIFileReceiver` define contracts for UI components that the orchestrator can work with. The builder also accepts standard Vaadin components ([classname]`MessageInput`, [classname]`MessageList`, [classname]`UploadManager`, [classname]`Upload`) directly.
 
 Add the UI components to your layout and pass them to the orchestrator through its builder. The orchestrator wires them together and manages the LLM interaction.
 
 
 == Basic Usage
 
-Create an AI Orchestrator by passing an LLM provider and optional UI components to the builder. The following example uses a mock provider -- replace it with a real provider via [methodname]`LLMProvider.from(...)` in production (see <<llm-providers#,LLM Providers>>).
+Create an AI Orchestrator by passing an LLM provider and optional UI components to the builder. The following example uses a mock provider -- replace it with a real provider such as [classname]`SpringAILLMProvider` or [classname]`LangChain4JLLMProvider` in production (see <<llm-providers#,LLM Providers>>).
 
 [.example.show-code]
 --

articles/flow/ai-support/llm-providers.adoc

Lines changed: 7 additions & 7 deletions

@@ -10,7 +10,7 @@ order: 20
 
 ifdef::flow[]
 
-The orchestrator uses the [classname]`LLMProvider` interface to communicate with LLM frameworks. Use the static [methodname]`LLMProvider.from(...)` factory methods to create a provider from a vendor-specific model or client object. Two implementations are provided: one for Spring AI and one for LangChain4j.
+The orchestrator uses the [classname]`LLMProvider` interface to communicate with LLM frameworks. Create a provider by instantiating the appropriate implementation directly. Two implementations are provided: one for Spring AI and one for LangChain4j.
 
 .Memory Window Limit
 [IMPORTANT]

@@ -26,12 +26,12 @@ Both built-in providers maintain a 30-message memory window. Older messages are
 // From ChatModel - use an implementation of Spring AI ChatModel
 ChatModel chatModel = OpenAiChatModel.builder()
     .openAiApi(...).defaultOptions(...).build();
-SpringAILLMProvider provider = LLMProvider.from(chatModel);
+SpringAILLMProvider provider = new SpringAILLMProvider(chatModel);
 
 // From ChatClient - use a Spring AI ChatClient
 ChatClient chatClient = ChatClient.builder(...)
     .defaultAdvisors(...).build();
-SpringAILLMProvider provider = LLMProvider.from(chatClient);
+SpringAILLMProvider provider = new SpringAILLMProvider(chatClient);
 ----
 
 When created from a [classname]`ChatModel`, the provider manages its own conversation memory using a 30-message window. When created from a [classname]`ChatClient`, memory must be configured externally on the client.

@@ -45,24 +45,24 @@ provider.setStreaming(false);
 
 .History Restoration with ChatClient
 [NOTE]
-History restoration via [methodname]`withHistory()` is only supported when creating the provider from a [classname]`ChatModel`. Providers created from a [classname]`ChatClient` do not provide access to internal memory, so calling [methodname]`setHistory()` throws an [classname]`UnsupportedOperationException`. Use [methodname]`LLMProvider.from(chatModel)` if you need to restore conversation history across sessions.
+History restoration via [methodname]`withHistory()` is only supported when creating the provider from a [classname]`ChatModel`. Providers created from a [classname]`ChatClient` do not provide access to internal memory, so calling [methodname]`setHistory()` throws an [classname]`UnsupportedOperationException`. Use `new SpringAILLMProvider(chatModel)` if you need to restore conversation history across sessions.
 
 
 == LangChain4j
 
-[classname]`LangChain4JLLMProvider` supports both streaming and synchronous LangChain4j models. The mode is determined by the model type passed to the factory method:
+[classname]`LangChain4JLLMProvider` supports both streaming and synchronous LangChain4j models. The mode is determined by the model type passed to the constructor:
 
 [source,java]
 ----
 // Streaming mode - use an implementation of LangChain4j StreamingChatModel
 StreamingChatModel streamingChatModel = OpenAiStreamingChatModel.builder()
     .apiKey(...).modelName(...).build();
-LangChain4JLLMProvider provider = LLMProvider.from(streamingChatModel);
+LangChain4JLLMProvider provider = new LangChain4JLLMProvider(streamingChatModel);
 
 // Synchronous mode - use an implementation of LangChain4j ChatModel
 ChatModel chatModel = OpenAiChatModel.builder()
     .apiKey(...).modelName(...).build();
-LangChain4JLLMProvider provider = LLMProvider.from(chatModel);
+LangChain4JLLMProvider provider = new LangChain4JLLMProvider(chatModel);
 ----
 
 The provider manages its own conversation memory using a 30-message window.
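The change across all three files is the same mechanical rename: a static factory call becomes a direct constructor call. As a minimal self-contained sketch of that pattern (using a hypothetical `ChatModel` stand-in interface and a `SpringAILLMProviderSketch` class, since the real Vaadin and Spring AI types are not reproduced here), the constructor-based style the updated docs describe looks like:

```java
// Hypothetical stand-in for Spring AI's ChatModel; illustration only.
interface ChatModel {
    String call(String prompt);
}

// Hypothetical stand-in for SpringAILLMProvider. The docs previously showed
// a static factory, LLMProvider.from(chatModel); after this commit the
// provider is instantiated directly via its constructor.
class SpringAILLMProviderSketch {
    private final ChatModel model;

    SpringAILLMProviderSketch(ChatModel model) { // constructor replaces the from(...) factory
        this.model = model;
    }

    String complete(String prompt) {
        return model.call(prompt);
    }
}

public class MigrationSketch {
    public static String demo() {
        // Mock model, in the spirit of the docs' basic-usage example.
        ChatModel mockModel = prompt -> "echo: " + prompt;
        // Direct instantiation, matching the updated documentation style.
        var provider = new SpringAILLMProviderSketch(mockModel);
        return provider.complete("hello");
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Dropping the factory in favor of the constructor makes the chosen implementation (Spring AI vs. LangChain4j) explicit at the call site instead of being inferred from the argument type.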
