Merged
14 changes: 14 additions & 0 deletions conversation/components/ollama.yaml
@@ -0,0 +1,14 @@
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: ollama
spec:
type: conversation.openai
version: v1
metadata:
- name: key
value: 'ollama'
- name: model
value: gpt-oss:20b
- name: endpoint
value: 'http://localhost:11434/v1' # ollama endpoint https://ollama.com/blog/openai-compatibility
12 changes: 12 additions & 0 deletions conversation/components/openai.yaml
@@ -0,0 +1,12 @@
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
name: openai
spec:
type: conversation.openai
version: v1
metadata:
- name: key
value: "YOUR_OPENAI_API_KEY"
- name: model
value: gpt-4o-mini-2024-07-18
54 changes: 54 additions & 0 deletions conversation/python/http/README.md
@@ -24,7 +24,34 @@ name: Install Python dependencies

```bash
cd ./conversation
```

<details open="true">
<summary>Option 1: Using venv (Python's built-in virtual environment)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
pip3 install -r requirements.txt
```

</details>

<details>
<summary>Option 2: Using uv (faster alternative to pip)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
# If you don't have uv installed yet, install it first:
# pip install uv
uv pip install -r requirements.txt
```

</details>

```bash
# Return to the parent directory
cd ..
```

@@ -82,12 +109,39 @@ Open a terminal and run:

```bash
cd ./conversation
```

<details open="true">
<summary>Option 1: Using venv (Python's built-in virtual environment)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
pip3 install -r requirements.txt
```

</details>

<details>
<summary>Option 2: Using uv (faster alternative to pip)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
# If you don't have uv installed yet, install it first:
# pip install uv
uv pip install -r requirements.txt
```

</details>

2. Run the application:

```bash
# Make sure your virtual environment is activated
# If not already activated, run:
# source .venv/bin/activate # On Windows, use: .venv\Scripts\activate

dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
```
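With the app and sidecar running, you can also exercise the Conversation API directly over HTTP. The sketch below is a minimal illustration, assuming the default sidecar HTTP port (3500) and the alpha endpoint path `/v1.0-alpha1/conversation/<component>/converse`; verify both against your Dapr version:

```python
import json
import urllib.request

DAPR_HTTP_PORT = 3500  # default sidecar HTTP port; adjust if yours differs
COMPONENT = "echo"     # the mock LLM component used in this quickstart

def build_converse_request(component: str, prompt: str):
    """Build the URL and JSON body for the alpha Conversation endpoint."""
    url = f"http://localhost:{DAPR_HTTP_PORT}/v1.0-alpha1/conversation/{component}/converse"
    body = {"inputs": [{"content": prompt}]}
    return url, json.dumps(body).encode("utf-8")

if __name__ == "__main__":
    url, body = build_converse_request(COMPONENT, "What is dapr?")
    req = urllib.request.Request(url, data=body, headers={"Content-Type": "application/json"})
    try:
        # With the echo component, the response echoes the prompt back.
        with urllib.request.urlopen(req, timeout=5) as resp:
            print(json.load(resp))
    except OSError as exc:
        print(f"Could not reach the Dapr sidecar: {exc}")
```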

262 changes: 168 additions & 94 deletions conversation/python/sdk/README.md
@@ -1,97 +1,171 @@
# Dapr Conversation API (Python SDK)

In this quickstart, you'll send an input to a mock Large Language Model (LLM) using Dapr's Conversation API. This API is responsible for providing one consistent API entry point to talk to underlying LLM providers.
This quickstart demonstrates how to interact with Large Language Models (LLMs) using Dapr's Conversation API, which provides one consistent entry point for communicating with a variety of LLM providers.

Visit [this](https://docs.dapr.io/developing-applications/building-blocks/conversation/conversation-overview/) link for more information about Dapr and the Conversation API.

This quickstart includes one app:

- `app.py`, responsible for sending an input to the underlying LLM and retrieving an output.

## Run the app with the template file

This section shows how to run the application using the [multi-app run template files](https://docs.dapr.io/developing-applications/local-development/multi-app-dapr-run/multi-app-overview/) with `dapr run -f .`.

This example uses Dapr's default `echo` LLM component, which simply echoes the input back for testing purposes. Here are other [supported Conversation components](https://docs.dapr.io/reference/components-reference/supported-conversation/).

1. Install dependencies:

<!-- STEP
name: Install Python dependencies
-->

```bash
cd ./conversation
pip3 install -r requirements.txt
cd ..
```

<!-- END_STEP -->

2. Open a new terminal window and run the multi app run template:

<!-- STEP
name: Run multi app run template
expected_stdout_lines:
- '== APP - conversation == Input sent: What is dapr?'
- '== APP - conversation == Output response: What is dapr?'
expected_stderr_lines:
output_match_mode: substring
match_order: none
background: true
sleep: 15
timeout_seconds: 30
-->

```bash
dapr run -f .
```

The terminal console output should look similar to this, where:

- The app sends an input `What is dapr?` to the `echo` Component mock LLM.
- The mock LLM echoes `What is dapr?`.

```text
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
```

<!-- END_STEP -->

3. Stop and clean up application processes.

<!-- STEP
name: Stop multi-app run
sleep: 5
-->

```bash
dapr stop -f .
```

<!-- END_STEP -->

## Run the app with the Dapr CLI

1. Install dependencies:

Open a terminal and run:

```bash
cd ./conversation
pip3 install -r requirements.txt
```

2. Run the application:

```bash
dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
```

You should see the output:

```bash
== APP == Input sent: What is dapr?
== APP == Output response: What is dapr?
```
For comprehensive documentation on Dapr's Conversation API, see the [official documentation](https://docs.dapr.io/developing-applications/building-blocks/conversation/conversation-overview/).

## Sample Applications

This quickstart includes three example applications:

- `app.py`: Basic example that sends a prompt to an LLM and retrieves the response
- `tool_calling.py`: Advanced example that defines a tool and sends a request to an LLM that supports tool calling
- `tool_calling_from_function.py`: Similar to `tool_calling.py` but uses a helper function to generate the JSON schema for function calling
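At its core, `app.py` boils down to a single SDK call. The sketch below is a hedged illustration only: the method name `converse_alpha1` and the `ConversationInput` import path reflect the alpha Conversation API in recent Dapr Python SDK releases and may differ in your SDK version, so treat `app.py` itself as authoritative:

```python
from dapr.clients import DaprClient
from dapr.clients.grpc._request import ConversationInput

provider_component = "echo"  # switch to "openai" or "ollama" per the provider notes

with DaprClient() as client:
    # Send one user message to the configured LLM component.
    inputs = [ConversationInput(content="What is dapr?", role="user")]
    response = client.converse_alpha1(name=provider_component, inputs=inputs)
    for output in response.outputs:
        print(f"Output response: {output.result}")
```

Running this requires a Dapr sidecar with the component loaded, as described in the run instructions below.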

## LLM Providers

By default, this quickstart uses Dapr's mock LLM Echo Component, which simply echoes back the input for testing purposes.

The repository also includes pre-configured components for the following LLM providers:
- [OpenAI](../../components/openai.yaml)
- [Ollama](../../components/ollama.yaml) (via its OpenAI compatibility layer)

To use one of these alternative providers, change the `provider_component` value in your application code from `echo` to either `openai` or `ollama`.

You can also add components for other LLM providers supported by Dapr.

### OpenAI Configuration

To use the OpenAI provider:

1. Change the `provider_component` parameter in your application code to `openai`
2. Edit the [openai.yaml](../../components/openai.yaml) component file and replace `YOUR_OPENAI_API_KEY` with your actual OpenAI API key

### Ollama Configuration

To use the Ollama provider:

1. Change the `provider_component` parameter in your application code to `ollama`
2. Install and run Ollama locally on your machine
3. Pull a model with tool-calling support from the [Ollama models repository](https://ollama.com/search?c=tools)

The default configuration uses the `gpt-oss:20b` model, but you can modify the component file to use any compatible model that your system can run.
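For example, to switch to a different model, edit the `model` entry in [ollama.yaml](../../components/ollama.yaml). The model name below is purely illustrative; use any tool-calling-capable model you have pulled locally:

```yaml
  - name: model
    value: llama3.1:8b  # illustrative; any tool-capable pulled model works
```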

## Running the Application

You can run the sample applications using either the Dapr multi-app template or the Dapr CLI directly.

### Option 1: Using the Multi-App Template

This approach uses [Dapr's multi-app run template files](https://docs.dapr.io/developing-applications/local-development/multi-app-dapr-run/multi-app-overview/) to simplify deployment with `dapr run -f .`.

For more LLM options, see the [supported Conversation components](https://docs.dapr.io/reference/components-reference/supported-conversation/) documentation.

1. **Install dependencies:**

<!-- STEP
name: Install Python dependencies
-->

```bash
cd ./conversation
```

<details open="true">
<summary>Option 1: Using venv (Python's built-in virtual environment)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
pip3 install -r requirements.txt
```

</details>

<details>
<summary>Option 2: Using uv (faster alternative to pip)</summary>

```bash
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
# If you do not have uv installed yet, install it first:
# pip install uv
uv pip install -r requirements.txt
```

</details>

```bash
# Return to the parent directory
cd ..
```
<!-- END_STEP -->

2. **Run the application:**

<!-- STEP
name: Run multi app run template
expected_stdout_lines:
- '== APP - conversation == Input sent: What is dapr?'
- '== APP - conversation == Output response: What is dapr?'
expected_stderr_lines:
output_match_mode: substring
match_order: none
background: true
sleep: 15
timeout_seconds: 30
-->

```bash
dapr run -f .
```

Expected output:

```text
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
```

<!-- END_STEP -->

3. **Stop the application:**

<!-- STEP
name: Stop multi-app run
sleep: 5
-->

```bash
dapr stop -f .
```

<!-- END_STEP -->

### Option 2: Using the Dapr CLI Directly

As an alternative to the multi-app template, you can run the application directly with the Dapr CLI.

1. **Install dependencies:**

```bash
cd ./conversation
python3 -m venv .venv
source .venv/bin/activate # On Windows, use: .venv\Scripts\activate
pip3 install -r requirements.txt
```

2. **Run the application:**

```bash
dapr run --app-id conversation --resources-path ../../../components -- python3 app.py
```

Expected output:

```text
== APP == Input sent: What is dapr?
== APP == Output response: What is dapr?
```

3. **Try the tool calling examples:**

You can run the other example applications similarly:

```bash
# For tool calling example
dapr run --app-id conversation --resources-path ../../../components -- python3 tool_calling.py
# For tool calling with function helper example
dapr run --app-id conversation --resources-path ../../../components -- python3 tool_calling_from_function.py
```
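The helper used by `tool_calling_from_function.py` derives a tool-calling JSON schema from a Python function's signature. The sketch below illustrates that general idea with a hypothetical `function_to_tool_schema` helper; it is not the quickstart's actual implementation, and real helpers typically handle more types and docstring parsing:

```python
import inspect

# Minimal mapping from Python annotations to JSON Schema types.
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_to_tool_schema(fn) -> dict:
    """Derive an OpenAI-style tool definition from a function's signature."""
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        ann = param.annotation if param.annotation is not inspect.Parameter.empty else str
        props[name] = {"type": _JSON_TYPES.get(ann, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {"type": "object", "properties": props, "required": required},
        },
    }

def get_weather(city: str, unit: str = "celsius") -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city}"

schema = function_to_tool_schema(get_weather)
```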