Commit 4144e80

Merge pull request #6968 from syncfusion-content/986912-dialog-blazor-hotfix

986912: resolved Liquid syntax error

2 parents 846ae95 + 276a3d5 commit 4144e80

12 files changed: +1326 −5 lines
blazor-toc.html

Lines changed: 17 additions & 0 deletions
@@ -497,12 +497,19 @@
 <ul>
 <li><a href="/blazor/ai-assistview/ai-integrations/gemini-integration">Google Gemini</a></li>
 <li><a href="/blazor/ai-assistview/ai-integrations/openai-integration">Azure OpenAI</a></li>
+<li><a href="/blazor/ai-assistview/ai-integrations/ollama-llm-integration">Ollama LLM</a></li>
 </ul>
 </li>
 <li><a href="/blazor/ai-assistview/toolbar-items">Toolbar items</a></li>
 <li><a href="/blazor/ai-assistview/custom-view">Custom views</a></li>
 <li><a href="/blazor/ai-assistview/file-attachments">File attachments</a></li>
 <li><a href="/blazor/ai-assistview/templates">Templates</a></li>
+<li>Speech
+<ul>
+<li><a href="/blazor/ai-assistview/speech/speech-to-text">Speech to Text</a></li>
+<li><a href="/blazor/ai-assistview/speech/text-to-speech">Text to Speech</a></li>
+</ul>
+</li>
 <li><a href="/blazor/ai-assistview/appearance">Appearance</a></li>
 <li><a href="/blazor/ai-assistview/accessibility">Accessibility</a></li>
 <li><a href="/blazor/ai-assistview/methods">Methods</a></li>
@@ -1489,6 +1496,16 @@
 </ul>
 </li>
 <li><a href="/blazor/chat-ui/messages">Messages</a></li>
+<li>Chat Bot Integrations
+<ul>
+<li>
+<a href="/blazor/chat-ui/bot-integrations/integration-with-bot-dialogflow">Google Dialogflow</a>
+</li>
+<li>
+<a href="/blazor/chat-ui/bot-integrations/integration-with-bot-framework">Microsoft Bot Framework</a>
+</li>
+</ul>
+</li>
 <li><a href="/blazor/chat-ui/timebreak">Time break</a></li>
 <li><a href="/blazor/chat-ui/timestamp">Timestamp</a></li>
 <li><a href="/blazor/chat-ui/typing-indicator">Typing indicator</a></li>

blazor/ai-assistview/ai-integrations/gemini-integration.md

Lines changed: 3 additions & 3 deletions
@@ -29,15 +29,15 @@ Follow the Syncfusion AI AssistView [Getting Started](../getting-started) guide
 
 Install the required packages:
 
-1. Install the `Gemini AI` nuget package in the application.
+* Install the `Gemini AI` nuget package in the application.
 
 ```bash
 
 Nuget\Install-Package Mscc.GenerativeAI
 
 ```
 
-2. Install the `Markdig` nuget packages in the application.
+* Install the `Markdig` nuget packages in the application.
 
 ```bash
 
@@ -59,7 +59,7 @@ Nuget\Install-Package Markdig
 
 ## Gemini AI with AI AssistView
 
-Modify the Razor file to integrate the Gemini AI with the AI AssistView component.
+Modify the razor file to integrate the Gemini AI with the AI AssistView component.
 
 * update your Gemini API key securely in the configuration:
 
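The hunk above ends with advice to keep the Gemini API key in configuration rather than in the razor file. A minimal, framework-neutral sketch of that pattern in Python (the variable name `GEMINI_API_KEY` is an assumption for illustration; it is not prescribed by the docs):

```python
import os

def load_api_key(env_var: str = "GEMINI_API_KEY") -> str:
    """Read the API key from the environment so it never lands in source control."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set the {env_var} environment variable before starting the app.")
    return key
```

In the Blazor sample itself the same idea applies through .NET configuration sources such as user secrets or environment variables.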
Lines changed: 282 additions & 0 deletions
@@ -0,0 +1,282 @@
+---
+layout: post
+title: LLM Model with Blazor AI AssistView Component | Syncfusion
+description: Check out and learn about integrating an LLM model with the Blazor AI AssistView component in a Blazor WebAssembly application.
+platform: Blazor
+control: AI AssistView
+documentation: ug
+---
+
+# Integrate LLM via Ollama with Blazor AI AssistView Component
+
+The AI AssistView component integrates with [LLM via Ollama](https://ollama.com) to enable advanced conversational AI features in your Blazor application. The component acts as a user interface where user prompts are sent to the selected LLM model via API calls, providing natural language understanding and context-aware responses.
+
+## Prerequisites
+
+Before starting, ensure you have the following:
+
+* [Ollama](https://ollama.com) installed to run and manage LLM models locally.
+
+* **Syncfusion AI AssistView**: the [Syncfusion.Blazor.InteractiveChat](https://www.nuget.org/packages/Syncfusion.Blazor.InteractiveChat) NuGet package installed.
+
+* [Markdig](https://www.nuget.org/packages/Markdig) package for parsing Markdown responses.
+
+## Set Up the AI AssistView Component
+
+Follow the Syncfusion AI AssistView [Getting Started](../getting-started) guide to configure and render the AI AssistView component in the application and to confirm the prerequisites are met.
+
+## Install Dependency
+
+Install the Markdig package by running `NuGet\Install-Package Markdig` in the Package Manager Console.
+
+## Configuring Ollama
+
+Install Ollama for your operating system:
+
+{% tabs %}
+{% highlight ts tabtitle="Windows" %}
+
+1. Visit the [Windows](https://ollama.com/download) download page.
+2. Click `Download for Windows` to get the `.exe` installer.
+3. Run `OllamaSetup.exe` and follow the wizard to install.
+
+{% endhighlight %}
+
+{% highlight ts tabtitle="macOS" %}
+
+1. Visit the [macOS](https://ollama.com/download/mac) download page.
+2. Click `Download for macOS` to get the `.dmg` file.
+3. Install it by following the wizard.
+
+{% endhighlight %}
+
+{% highlight ts tabtitle="Linux" %}
+
+1. Visit the [Linux](https://ollama.com/download/linux) download page.
+2. Run the following command to install Ollama on your system:
+
+curl -fsSL https://ollama.com/install.sh | sh
+
+{% endhighlight %}
+{% endtabs %}
+
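Before downloading models, it can help to confirm the installed Ollama service is actually listening on its default port, 11434. A small Python sketch, assuming the server's `/api/version` endpoint (exposed by a default Ollama install):

```python
import urllib.error
import urllib.request

def ollama_is_up(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers on base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if ollama_is_up():
    print("Ollama server is reachable")
else:
    print("Ollama server is not running; start it with `ollama serve`")
```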
+## Download and run an Ollama model
+
+* Download and run a model using the following command. Replace `deepseek-r1` with your preferred model (e.g., `llama3`, `phi4`). See the [Ollama model](https://ollama.com/search) library for available models.
+
+```bash
+
+ollama run deepseek-r1
+
+```
+
+* After the model download completes, start the Ollama server to make the model accessible:
+
+```bash
+
+ollama serve
+
+```
+
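The server started by `ollama serve` streams each reply as newline-delimited JSON chunks over HTTP. As a rough illustration of how such chunks are reassembled into a full reply (the sample payloads below are made up for the example; real chunks carry extra fields such as model name and timings):

```python
import json

def join_stream(ndjson_lines):
    """Concatenate the 'response' fragments of a streamed Ollama reply."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # the final chunk is flagged done: true
            break
    return "".join(parts)

# Illustrative payloads only.
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world!", "done": true}',
]
print(join_stream(sample))  # Hello, world!
```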
+## Configure AI AssistView with Ollama
+
+To integrate Ollama with the Syncfusion Blazor AI AssistView component in your Blazor application:
+
+* Configure the AI services in the `Program.cs` file to register the Ollama client and Syncfusion Blazor services.
+
+{% tabs %}
+{% highlight cs tabtitle="Program.cs" %}
+
+using Blazor_AssistView_Ollama.Components;
+using Microsoft.Extensions.Caching.Memory;
+using Microsoft.Extensions.AI;
+using OllamaSharp;
+using Syncfusion.Blazor;
+
+var builder = WebApplication.CreateBuilder(args);
+
+// Add services to the container.
+builder.Services.AddRazorComponents()
+    .AddInteractiveServerComponents();
+builder.Services.AddSyncfusionBlazor();
+
+builder.Services.AddHttpClient();
+
+builder.Services.AddDistributedMemoryCache();
+
+// Ollama configuration
+builder.Services.AddChatClient(new OllamaApiClient(new Uri("http://localhost:11434/"), "llama3.2"))
+    .UseDistributedCache()
+    .UseLogging();
+
+var app = builder.Build();
+
+// Configure the HTTP request pipeline.
+if (!app.Environment.IsDevelopment())
+{
+    app.UseExceptionHandler("/Error", createScopeForErrors: true);
+    // The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
+    app.UseHsts();
+}
+
+app.UseHttpsRedirection();
+
+app.UseAntiforgery();
+
+app.MapStaticAssets();
+app.MapRazorComponents<App>()
+    .AddInteractiveServerRenderMode();
+
+app.Run();
+
+{% endhighlight %}
+{% endtabs %}
+
+* Modify the `Index.razor` file (or a dedicated component) to host the integration logic and handle prompt requests.
+
+{% tabs %}
+{% highlight razor %}
+
+@rendermode InteractiveServer
+@using Markdig
+@using Microsoft.Extensions.AI
+@using Syncfusion.Blazor.InteractiveChat
+@using Syncfusion.Blazor.Navigations
+
+<div class="control-section">
+    <div class="stream-aiassistview">
+        <SfAIAssistView @ref="AIAssist"
+                        PromptSuggestions="@suggestions"
+                        PromptRequested="@PromptRequest"
+                        ResponseStopped="@HandleStopResponse">
+            <AssistViews>
+                <AssistView>
+                    <BannerTemplate>
+                        <div class="banner-content">
+                            <div class="e-icons e-assistview-icon"></div>
+                            <h3>AI Assistance</h3>
+                            <i> Live responses streamed from your local Ollama model. </i>
+                        </div>
+                    </BannerTemplate>
+                </AssistView>
+            </AssistViews>
+
+            <AssistViewToolbar ItemClicked="ToolbarItemClicked">
+                <AssistViewToolbarItem Type="ItemType.Spacer"></AssistViewToolbarItem>
+                <AssistViewToolbarItem IconCss="e-icons e-refresh"></AssistViewToolbarItem>
+            </AssistViewToolbar>
+        </SfAIAssistView>
+    </div>
+</div>
+
+@code {
+    private SfAIAssistView AIAssist = new();
+    private bool responseStopped = false;
+    private bool isStreaming = false;
+
+    // Suggestion list
+    private List<string> suggestions = new()
+    {
+        "What are the best tools for organizing my tasks?",
+        "How can I maintain work-life balance effectively?"
+    };
+
+    [Inject] private IChatClient ChatClient { get; set; } = default!;
+
+    private async Task PromptRequest(AssistViewPromptRequestedEventArgs args)
+    {
+        responseStopped = false;
+        isStreaming = true; // turn on the Stop button
+
+        try
+        {
+            var pipeline = new MarkdownPipelineBuilder()
+                .UseAdvancedExtensions()
+                .UsePipeTables()
+                .UseTaskLists()
+                .Build();
+
+            var messages = new List<Microsoft.Extensions.AI.ChatMessage>
+            {
+                new(ChatRole.System, "You are a helpful AI assistant. Respond with clear, concise explanations. Use Markdown when helpful."),
+                new(ChatRole.User, args.Prompt)
+            };
+
+            var buffer = new System.Text.StringBuilder();
+            const int updateRateChars = 5;
+            int lastLenPushed = 0;
+
+            await foreach (var update in ChatClient.GetStreamingResponseAsync(messages))
+            {
+                if (responseStopped) break;
+                if (string.IsNullOrEmpty(update?.Text)) continue;
+
+                buffer.Append(update.Text);
+
+                if (buffer.Length - lastLenPushed >= updateRateChars)
+                {
+                    string html = Markdown.ToHtml(buffer.ToString(), pipeline);
+                    await AIAssist.UpdateResponseAsync(html);
+                    await AIAssist.ScrollToBottomAsync();
+                    lastLenPushed = buffer.Length;
+                }
+            }
+
+            if (!responseStopped)
+            {
+                string finalHtml = Markdown.ToHtml(buffer.ToString(), pipeline);
+                await AIAssist.UpdateResponseAsync(finalHtml);
+                await AIAssist.ScrollToBottomAsync();
+            }
+
+            args.PromptSuggestions = suggestions;
+        }
+        catch (Exception ex)
+        {
+            await AIAssist.UpdateResponseAsync($"Error generating response: {ex.Message}");
+            await AIAssist.ScrollToBottomAsync();
+        }
+        finally
+        {
+            responseStopped = false;
+            isStreaming = false; // turn the Stop button back off
+            StateHasChanged();
+        }
+    }
+
+    private void ToolbarItemClicked(AssistViewToolbarItemClickedEventArgs args)
+    {
+        // Handle Refresh
+        if (args.Item.IconCss == "e-icons e-refresh")
+        {
+            AIAssist.Prompts.Clear();
+            AIAssist.PromptSuggestions = suggestions;
+        }
+    }
+
+    private void HandleStopResponse(ResponseStoppedEventArgs args)
+    {
+        responseStopped = true;
+    }
+}
+
+<style>
+    .stream-aiassistview {
+        height: 350px;
+        width: 650px;
+        margin: 0 auto;
+    }
+    .stream-aiassistview .banner-content .e-assistview-icon:before {
+        font-size: 25px;
+    }
+    .stream-aiassistview .banner-content {
+        text-align: center;
+    }
+</style>
+
+{% endhighlight %}
+{% endtabs %}
+
+![Blazor AI AssistView LLM Integration](../images/llm-integration.png)
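The razor sample above throttles UI updates: streamed text accumulates in a buffer, the response is re-rendered only after at least `updateRateChars` new characters arrive, and one final render pushes whatever remains. The same buffering logic, sketched in Python for clarity (the list of returned strings stands in for the sequence of `UpdateResponseAsync` calls):

```python
def throttled_flushes(chunks, update_rate_chars=5):
    """Record a 'render' only when at least update_rate_chars new characters
    have accumulated since the last render, plus one final full render."""
    buffer = ""
    last_pushed = 0
    flushes = []
    for chunk in chunks:
        buffer += chunk
        if len(buffer) - last_pushed >= update_rate_chars:
            flushes.append(buffer)  # analogous to UpdateResponseAsync(html)
            last_pushed = len(buffer)
    flushes.append(buffer)          # final full render, as in the sample
    return flushes

print(throttled_flushes(["ab", "cd", "ef", "g"]))  # ['abcdef', 'abcdefg']
```

Batching renders this way keeps the UI responsive when the model emits many tiny token fragments per second.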

blazor/ai-assistview/ai-integrations/openai-integration.md

Lines changed: 2 additions & 2 deletions
@@ -29,7 +29,7 @@ Follow the Syncfusion AI AssistView [Getting Started](../getting-started) guide
 
 Install the required packages:
 
-1. Install the `OpenAI` and `Azure` nuget packages in the application.
+* Install the `OpenAI` and `Azure` nuget packages in the application.
 
 ```bash
 
@@ -39,7 +39,7 @@ NuGet\Install-Package Azure.Core
 
 ```
 
-2. Install the `Markdig` nuget packages in the application.
+* Install the `Markdig` nuget packages in the application.
 
 ```bash
 
Binary image files added (previews not shown): 13.6 KB, 25.1 KB, 21.9 KB

0 commit comments
