# update for Dapr 1.16 conversation SDK Python quickstart #1215
**Merged**: alicejgibbons merged 10 commits into `dapr:release-1.16` from `filintod:filinto/conversation-sdk-116` on Sep 15, 2025.
## Commits

1. `48cf96a` add alpha2 examples to conversation sdk (filintod)
2. `f9371fb` Merge branch 'release-1.16' into filinto/conversation-sdk-116 (filintod)
3. `7cceaba` remove real llm provider info (filintod)
4. `217d4ce` remove real llm provider info (filintod)
5. `d7cf203` Merge branch 'release-1.16' into filinto/conversation-sdk-116 (filintod)
6. `2e19dde` updates, merge from release-1.16, fix tests, separate app IDs (filintod)
7. `3d68ef4` Merge branch 'release-1.16' into filinto/conversation-sdk-116 (filintod)
8. `a5845de` updated from feedback (filintod)
9. `f6366a5` update requirement to rc2 (filintod)
10. `e492101` Merge branch 'release-1.16' into filinto/conversation-sdk-116 (alicejgibbons)
`components/ollama.yaml` (new file):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: 'ollama'
  - name: model
    value: gpt-oss:20b
  - name: endpoint
    value: 'http://localhost:11434/v1' # ollama endpoint https://ollama.com/blog/openai-compatibility
```
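The component above reuses Dapr's `conversation.openai` component type against Ollama's OpenAI-compatible endpoint. As a rough illustration of what that compatibility layer accepts (not part of the quickstart itself), the sketch below builds an OpenAI-style chat-completions request for that endpoint; it only constructs the request, since actually sending it requires a running Ollama server.

```python
import json
from urllib import request

OLLAMA_ENDPOINT = "http://localhost:11434/v1"  # from the component's `endpoint` metadata


def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat-completions request for the Ollama endpoint."""
    body = {
        "model": model,  # e.g. gpt-oss:20b, as configured in the component
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{OLLAMA_ENDPOINT}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key's value; 'ollama' matches the component config.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )


req = build_chat_request("gpt-oss:20b", "What is dapr?")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
# To actually send it (requires a running Ollama server):
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

In the quickstart you never call this endpoint yourself; the Dapr sidecar does it for you based on the component file.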
`components/openai.yaml` (new file):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: "YOUR_OPENAI_API_KEY"
  - name: model
    value: gpt-4o-mini-2024-07-18
```
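Since this file ships with a `YOUR_OPENAI_API_KEY` placeholder, one common pattern (an illustration, not something the quickstart does for you) is to render the real key into the component from an environment variable instead of committing it:

```python
import os


def render_component(template: str, api_key: str) -> str:
    """Replace the placeholder key in the component YAML with a real one."""
    return template.replace("YOUR_OPENAI_API_KEY", api_key)


template = """\
  - name: key
    value: "YOUR_OPENAI_API_KEY"
"""

# Fall back to a dummy value so the sketch runs even without a real key set.
rendered = render_component(template, os.environ.get("OPENAI_API_KEY", "sk-dummy"))
print(rendered)
```

Keeping the rendered file out of version control avoids leaking the key, which is also why this PR's `remove real llm provider info` commits scrubbed real provider details.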
`README.md` (updated):

# Dapr Conversation API (Python SDK)

This quickstart demonstrates how to interact with Large Language Models (LLMs) using Dapr's Conversation API. The Conversation API provides a unified interface for communicating with various LLM providers through a consistent entry point.
For comprehensive documentation on Dapr's Conversation API, see the [official documentation](https://docs.dapr.io/developing-applications/building-blocks/conversation/conversation-overview/).

## Sample Applications

This quickstart includes three example applications:

- `app.py`: Basic example that sends a prompt to an LLM and retrieves the response
- `tool_calling.py`: Advanced example that defines a tool and sends a request to an LLM that supports tool calling
- `tool_calling_from_function.py`: Similar to `tool_calling.py` but uses a helper function to generate the JSON schema for function calling
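As a rough sketch of the flow in `app.py` (not the quickstart's actual code): the method and class names below follow the dapr-python SDK's alpha1 conversation surface (`converse_alpha1`, `ConversationInput`), and the alpha2 API this PR targets may differ. The pure helper is separated out so it runs without a sidecar; `main()` needs `pip install dapr` and a running Dapr sidecar.

```python
def format_output(prompt: str, result: str) -> list[str]:
    """Render the two log lines shown in the quickstart's expected output."""
    return [f"Input sent: {prompt}", f"Output response: {result}"]


def main() -> None:
    # Imports kept inside main() so the helper above works without the SDK.
    from dapr.clients import DaprClient
    from dapr.clients.grpc._request import ConversationInput

    prompt = "What is dapr?"
    with DaprClient() as client:
        # 'echo' is the mock component; switch to 'openai' or 'ollama' per the
        # provider sections of this README.
        response = client.converse_alpha1(
            name="echo",
            inputs=[ConversationInput(content=prompt, role="user")],
        )
    for line in format_output(prompt, response.outputs[0].result):
        print(line)


# To run against a live sidecar:
# main()
```

With the `echo` component, the result equals the prompt, which is exactly what the expected output in the run steps below shows.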
## LLM Providers

By default, this quickstart uses Dapr's mock LLM Echo Component, which simply echoes back the input for testing purposes.

The repository also includes pre-configured components for the following LLM providers:

- [OpenAI](../../components/openai.yaml)
- [Ollama](../../components/ollama.yaml) (via its OpenAI compatibility layer)

To use one of these alternative providers, change the `provider_component` value in your application code from `echo` to either `openai` or `ollama`.

You can also experiment with adding components for other LLM providers supported by Dapr.
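One way to make that switch configurable rather than an edit to the source (an illustration; the quickstart's apps may simply hardcode the name) is to read the component name from an environment variable and validate it against the components shipped here:

```python
import os

# Conversation components available in this quickstart's repo.
KNOWN_PROVIDERS = {"echo", "openai", "ollama"}


def resolve_provider(default: str = "echo") -> str:
    """Pick the conversation component name, falling back to the mock echo component."""
    name = os.environ.get("PROVIDER_COMPONENT", default)
    if name not in KNOWN_PROVIDERS:
        raise ValueError(
            f"Unknown provider component: {name!r}; expected one of {sorted(KNOWN_PROVIDERS)}"
        )
    return name


os.environ.pop("PROVIDER_COMPONENT", None)  # ensure a clean default for the demo
print(resolve_provider())  # echo
```

Failing fast on an unknown name beats a confusing sidecar error about a missing component at request time.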
### OpenAI Configuration

To use the OpenAI provider:

1. Change the `provider_component` parameter in your application code to `openai`
2. Edit the [openai.yaml](../../components/openai.yaml) component file and replace `YOUR_OPENAI_API_KEY` with your actual OpenAI API key

### Ollama Configuration

To use the Ollama provider:

1. Change the `provider_component` parameter in your application code to `ollama`
2. Install and run Ollama locally on your machine
3. Pull a model with tool-calling support from the [Ollama models repository](https://ollama.com/search?c=tools)

The default configuration uses the `gpt-oss:20b` model, but you can modify the component file to use any compatible model that your system can run.
## Running the Application

You can run the sample applications using either the Dapr multi-app template or the Dapr CLI directly.

### Option 1: Using the Multi-App Template

This approach uses [Dapr's multi-app run template files](https://docs.dapr.io/developing-applications/local-development/multi-app-dapr-run/multi-app-overview/) to simplify deployment with `dapr run -f .`.

For more LLM options, see the [supported Conversation components](https://docs.dapr.io/reference/components-reference/supported-conversation/) documentation.
1. **Install dependencies:**

   <!-- STEP
   name: Install Python dependencies
   -->

   ```bash
   cd ./conversation
   ```

   <details open="true">
   <summary>Option 1: Using pip</summary>

   ```bash
   python3 -m venv .venv
   source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate
   pip3 install -r requirements.txt
   ```

   </details>

   <details>
   <summary>Option 2: Using uv (faster alternative to pip)</summary>

   ```bash
   python3 -m venv .venv
   source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate
   # If you do not have uv installed yet, install it first:
   # pip install uv
   uv pip install -r requirements.txt
   ```

   </details>

   ```bash
   # Return to the parent directory
   cd ..
   ```

   <!-- END_STEP -->
2. **Run the application:**

   <!-- STEP
   name: Run multi app run template
   expected_stdout_lines:
     - '== APP - conversation == Input sent: What is dapr?'
     - '== APP - conversation == Output response: What is dapr?'
   expected_stderr_lines:
   output_match_mode: substring
   match_order: none
   background: true
   sleep: 15
   timeout_seconds: 30
   -->

   ```bash
   dapr run -f .
   ```

   Expected output:

   ```text
   == APP - conversation == Input sent: What is dapr?
   == APP - conversation == Output response: What is dapr?
   ```

   <!-- END_STEP -->

3. **Stop the application:**

   <!-- STEP
   name: Stop multi-app run
   sleep: 5
   -->

   ```bash
   dapr stop -f .
   ```

   <!-- END_STEP -->
### Option 2: Using the Dapr CLI Directly

As an alternative to the multi-app template, you can run the application directly with the Dapr CLI.

1. **Install dependencies:**

   ```bash
   cd ./conversation
   python3 -m venv .venv
   source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate
   pip3 install -r requirements.txt
   ```

2. **Run the application:**

   ```bash
   dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
   ```

   Expected output:

   ```text
   == APP == Input sent: What is dapr?
   == APP == Output response: What is dapr?
   ```

3. **Try the tool calling examples:**

   You can run the other example applications similarly:

   ```bash
   # For tool calling example
   dapr run --app-id conversation --resources-path ../../../components -- python3 tool_calling.py

   # For tool calling with function helper example
   dapr run --app-id conversation --resources-path ../../../components -- python3 tool_calling_from_function.py
   ```
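The idea behind `tool_calling_from_function.py` is generating a function-calling schema from a Python function instead of writing the JSON by hand. The sketch below is a minimal, hypothetical version of such a helper (the quickstart's actual helper and the schema shape the alpha2 API expects may differ); it inspects a function's signature and produces an OpenAI-style tool spec.

```python
import inspect

# Minimal mapping from Python annotations to JSON-schema types.
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}


def function_to_tool_schema(func) -> dict:
    """Build an OpenAI-style function-calling tool spec from a Python function."""
    sig = inspect.signature(func)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": _TYPE_MAP.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            "description": (func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


def get_weather(city: str, unit: str = "celsius") -> str:
    """Return the current weather for a city."""
    return f"sunny in {city} ({unit})"


schema = function_to_tool_schema(get_weather)
print(schema["function"]["name"])                     # get_weather
print(schema["function"]["parameters"]["required"])   # ['city']
```

Deriving the schema from the signature keeps the tool definition and the function that implements it from drifting apart.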