1 change: 1 addition & 0 deletions README.md
@@ -40,6 +40,7 @@ Here are the preferred tools for development.
| [8_streamlit_azure_openai_batch](./apps/8_streamlit_azure_openai_batch/README.md) | Call Azure OpenAI Batch API with Streamlit | ![8_streamlit_azure_openai_batch](./docs/images/8_streamlit_azure_openai_batch.main.png) |
| [9_streamlit_azure_document_intelligence](./apps/9_streamlit_azure_document_intelligence/README.md) | Call Azure AI Document Intelligence API with Streamlit | ![9_streamlit_azure_document_intelligence](./docs/images/9_streamlit_azure_document_intelligence.main.png) |
| [10_streamlit_batch_transcription](./apps/10_streamlit_batch_transcription/README.md) | Call Batch Transcription API with Streamlit | ![10_streamlit_batch_transcription](./docs/images/10_streamlit_batch_transcription.main.png) |
| [11_promptflow](./apps/11_promptflow/README.md) | Get started with Prompt flow | No Image |
| [99_streamlit_examples](./apps/99_streamlit_examples/README.md) | Code samples for Streamlit | ![99_streamlit_examples](./docs/images/99_streamlit_examples.explaindata.png) |

## How to run
1 change: 1 addition & 0 deletions apps/11_promptflow/.gitignore
@@ -0,0 +1 @@
**/.promptflow/*
183 changes: 183 additions & 0 deletions apps/11_promptflow/README.md
@@ -0,0 +1,183 @@
# Getting Started with Prompt flow

This application shows how to get started with [Prompt flow](https://github.com/microsoft/promptflow), a Python toolkit for building LLM-based AI applications such as chatbots.

## Prerequisites

- Python 3.10 or later
- Azure OpenAI Service

## Overview

Prompt flow is a suite of development tools designed to streamline the end-to-end development cycle of LLM-based AI applications, from ideation, prototyping, testing, and evaluation to production deployment and monitoring. It makes prompt engineering much easier and enables you to build LLM apps with production quality.

## Usage

1. Get the API key for Azure OpenAI Service
1. Copy [.env.template](../../.env.template) to `.env` in the same directory
1. Set credentials in `.env`
1. Run scripts in the [apps/11_promptflow](./) directory

Set up the environment and install dependencies:

```shell
# Create a virtual environment
$ python -m venv .venv

# Activate the virtual environment
$ source .venv/bin/activate

# Install dependencies
$ pip install -r requirements.txt
```
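
Before running the examples, it can help to confirm that the credentials in `.env` are actually picked up. The following is only a sketch and not part of the repository: the variable names `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_API_KEY` are placeholders for illustration, so substitute whatever names `.env.template` defines.

```python
import os

from dotenv import load_dotenv  # python-dotenv, used by the examples in this folder

# Read variables from the .env file in the current working directory.
load_dotenv()

# Placeholder variable names; replace them with the names from .env.template.
for name in ("AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_API_KEY"):
    print(f"{name} is {'set' if os.getenv(name) else 'MISSING'}")
```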

## Examples

[Prompt flow > Quick start](https://microsoft.github.io/promptflow/how-to-guides/quick-start.html) provides a quick start guide to Prompt flow.

### [chat_minimal](https://github.com/microsoft/promptflow/tree/main/examples/flex-flows/chat-minimal)

**Run as normal Python script**

```shell
$ python apps/11_promptflow/chat_minimal/main.py
```

**Run from CLI**

```shell
$ cd apps/11_promptflow/chat_minimal

# Test flow
$ pf flow test \
--flow main:chat \
--inputs question="What's the capital of France?"

# Test flow (multi-turn); opens a chat UI at http://localhost:{EPHEMERAL_PORT}
$ pf flow test \
--flow main:chat \
--ui

# Create run with multiple lines data
$ pf run create \
--flow main:chat \
--data ./data.jsonl \
--column-mapping question='${data.question}' \
--stream
```
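
The same batch run can also be created from Python with the `PFClient` SDK rather than the CLI. This is a hedged sketch, run from the `chat_minimal` directory; it assumes the SDK accepts a callable entry point for flex flows, as described in the Prompt flow documentation.

```python
from promptflow.client import PFClient

from main import chat  # the traced entry point defined in main.py

pf = PFClient()

# SDK counterpart of `pf run create --flow main:chat --data ./data.jsonl ...`
run = pf.run(
    flow=chat,  # flex flows can take a callable as the entry point
    data="./data.jsonl",
    column_mapping={"question": "${data.question}"},
)

# Per-line inputs and outputs of the finished run (a pandas DataFrame).
print(pf.get_details(run))
```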

### playground_chat

```shell
$ cd apps/11_promptflow

# Initialize a new flow
$ pf flow init \
--flow playground_chat \
--type chat

$ cd playground_chat

# Set parameters
$ CONNECTION_NAME=open_ai_connection
$ AZURE_OPENAI_KEY=<your_api_key>
$ AZURE_OPENAI_ENDPOINT=<your_api_endpoint>

# List connections
$ pf connection list

# Delete connection (if needed)
$ pf connection delete \
--name $CONNECTION_NAME

# Create connection
$ pf connection create \
--file azure_openai.yaml \
--set api_key=$AZURE_OPENAI_KEY \
--set api_base=$AZURE_OPENAI_ENDPOINT \
--name $CONNECTION_NAME

# Show connection
$ pf connection show \
--name $CONNECTION_NAME

# Interact with chat flow
$ pf flow test \
--flow . \
--interactive

# Test flow
$ pf flow test \
--flow . \
--inputs question="What's the capital of France?"

# Create run with multiple lines data
$ RUN_NAME=playground_chat-$(date +%s)
$ pf run create \
--name $RUN_NAME \
--flow . \
--data ./data.jsonl \
--column-mapping question='${data.question}' \
--stream

# Show run details
$ pf run show-details --name $RUN_NAME
```
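
The connection and run can also be managed from Python. The sketch below is an approximate SDK equivalent of the CLI commands above, run from the `playground_chat` directory; consult the Prompt flow documentation for the authoritative API. The environment variable names simply reuse the values set earlier.

```python
import os

from promptflow.client import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()

# Rough equivalent of `pf connection create --file azure_openai.yaml ...`.
connection = AzureOpenAIConnection(
    name="open_ai_connection",
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_base=os.environ["AZURE_OPENAI_ENDPOINT"],
)
pf.connections.create_or_update(connection)

# Rough equivalent of `pf run create --flow . --data ./data.jsonl ...`.
run = pf.run(
    flow=".",
    data="./data.jsonl",
    column_mapping={"question": "${data.question}"},
)
print(pf.get_details(run))
```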

### playground_evaluation

```shell
$ cd apps/11_promptflow

# Initialize a new flow
$ pf flow init \
--flow playground_evaluation \
--type evaluation

$ cd playground_evaluation

# Create run with multiple lines data
$ RUN_NAME=playground_evaluation-$(date +%s)
$ pf run create \
--name $RUN_NAME \
--flow . \
--data ./data.jsonl \
--column-mapping \
groundtruth='${data.groundtruth}' \
prediction='${data.prediction}' \
--stream

# Show run details
$ pf run show-details --name $RUN_NAME
```
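
Evaluation flows usually log aggregate metrics in addition to per-line outputs. As a hedged sketch, both can be retrieved with the Python SDK by passing the run name created above (the exact API surface may differ across promptflow versions):

```python
from promptflow.client import PFClient

pf = PFClient()

run_name = "<your_run_name>"  # e.g. the $RUN_NAME value used above

# Per-line inputs and outputs of the evaluation run.
print(pf.get_details(run_name))

# Aggregate metrics logged by the evaluation flow, if any.
print(pf.get_metrics(run_name))
```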

### playground_standard

```shell
$ cd apps/11_promptflow

# Initialize a new flow
$ pf flow init \
--flow playground_standard \
--type standard

$ cd playground_standard

# Create run with multiple lines data
$ RUN_NAME=playground_standard-$(date +%s)
$ pf run create \
--name $RUN_NAME \
--flow . \
--data ./data.jsonl \
--column-mapping text='${data.text}' \
--stream

# Show run details
$ pf run show-details --name $RUN_NAME
```
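
A standard flow wires Python tool nodes together through `flow.dag.yaml`. The following is only a rough sketch of what a Python tool node looks like, not the exact file that `pf flow init` generates:

```python
from promptflow import tool


@tool
def my_python_tool(text: str) -> str:
    # A tool node receives the inputs declared for it in flow.dag.yaml and
    # returns a value that downstream nodes or the flow output can consume.
    return f"Processed: {text}"
```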

## References

- [Prompt flow > repos](https://github.com/microsoft/promptflow)
- [Prompt flow > documents](https://microsoft.github.io/promptflow/)
22 changes: 22 additions & 0 deletions apps/11_promptflow/chat_minimal/chat.prompty
@@ -0,0 +1,22 @@
---
name: Minimal Chat
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o
  parameters:
    temperature: 0.2
    max_tokens: 1024
inputs:
  question:
    type: string
sample:
  question: "What is Prompt flow?"
---

system:
You are a helpful assistant.

user:
{{question}}
2 changes: 2 additions & 0 deletions apps/11_promptflow/chat_minimal/data.jsonl
@@ -0,0 +1,2 @@
{"question": "What is Prompt flow?"}
{"question": "What is ChatGPT? Please explain with consise statement"}
21 changes: 21 additions & 0 deletions apps/11_promptflow/chat_minimal/main.py
@@ -0,0 +1,21 @@
from pathlib import Path

from dotenv import load_dotenv
from promptflow.core import Prompty
from promptflow.tracing import start_trace, trace

BASE_DIR = Path(__file__).absolute().parent  # resolve chat.prompty relative to this script


@trace  # record this call in the Prompt flow trace viewer
def chat(question: str) -> str:
    load_dotenv()  # load Azure OpenAI credentials from .env
    prompty = Prompty.load(source=f"{BASE_DIR}/chat.prompty")  # prompt template + model config
    return prompty(question=question)  # execute the prompty and return the model's answer


if __name__ == "__main__":
    start_trace()  # start collecting traces for this run

    result = chat("What's the capital of France?")
    print(result)
5 changes: 5 additions & 0 deletions apps/11_promptflow/playground_chat/.gitignore
@@ -0,0 +1,5 @@
.env
__pycache__/
.promptflow/*
!.promptflow/flow.tools.json
.runs/
21 changes: 21 additions & 0 deletions apps/11_promptflow/playground_chat/.promptflow/flow.tools.json
@@ -0,0 +1,21 @@
{
  "package": {},
  "code": {
    "chat.jinja2": {
      "type": "llm",
      "inputs": {
        "question": {
          "type": [
            "string"
          ]
        },
        "chat_history": {
          "type": [
            "string"
          ]
        }
      },
      "description": "Chat with Chatbot"
    }
  }
}
86 changes: 86 additions & 0 deletions apps/11_promptflow/playground_chat/README.md
@@ -0,0 +1,86 @@
# Chat flow
Chat flow is designed for conversational application development, building upon the capabilities of standard flow and providing enhanced support for chat inputs/outputs and chat history management. With chat flow, you can easily create a chatbot that handles chat input and output.

## Create connection for LLM tool to use
You can follow these steps to create a connection required by an LLM tool.

Currently, there are two connection types supported by the LLM tool: "AzureOpenAI" and "OpenAI". If you want to use the "AzureOpenAI" connection type, you need to create an Azure OpenAI service first. Please refer to [Azure OpenAI Service](https://azure.microsoft.com/en-us/products/cognitive-services/openai-service/) for more details. If you want to use the "OpenAI" connection type, you need to create an OpenAI account first. Please refer to [OpenAI](https://platform.openai.com/) for more details.

```bash
# Override keys with --set to avoid yaml file changes
# Create OpenAI connection
pf connection create --file openai.yaml --set api_key=<your_api_key> --name open_ai_connection

# Create azure OpenAI connection
# pf connection create --file azure_openai.yaml --set api_key=<your_api_key> api_base=<your_api_base> --name open_ai_connection
```

Note that in [flow.dag.yaml](flow.dag.yaml) we are using a connection named `open_ai_connection`.
```bash
# show registered connection
pf connection show --name open_ai_connection
```
Please refer to connections [document](https://promptflow.azurewebsites.net/community/local/manage-connections.html) and [example](https://github.com/microsoft/promptflow/tree/main/examples/connections) for more details.

## Develop a chat flow

The most important elements that differentiate a chat flow from a standard flow are **Chat Input**, **Chat History**, and **Chat Output**.

- **Chat Input**: Chat input refers to the messages or queries submitted by users to the chatbot. Effectively handling chat input is crucial for a successful conversation, as it involves understanding user intentions, extracting relevant information, and triggering appropriate responses.

- **Chat History**: Chat history is the record of all interactions between the user and the chatbot, including both user inputs and AI-generated outputs. Maintaining chat history is essential for keeping track of the conversation context and ensuring the AI can generate contextually relevant responses. Chat History is a special type of chat flow input that stores chat messages in a structured format.

- **Chat Output**: Chat output refers to the AI-generated messages that are sent to the user in response to their inputs. Generating contextually appropriate and engaging chat outputs is vital for a positive user experience.

A chat flow can have multiple inputs, but Chat History and Chat Input are required inputs in a chat flow.
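
Concretely, the chat history consumed by this flow is a list of previous turns; the [chat.jinja2](chat.jinja2) template in this folder reads each turn's `inputs.question` and `outputs.answer`. A minimal sketch of that structure:

```python
# Shape of the chat_history input expected by chat.jinja2 in this flow:
# each item records the inputs and outputs of one earlier turn.
chat_history = [
    {
        "inputs": {"question": "What's the capital of France?"},
        "outputs": {"answer": "The capital of France is Paris."},
    },
    {
        "inputs": {"question": "And of Japan?"},
        "outputs": {"answer": "The capital of Japan is Tokyo."},
    },
]
```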

## Interact with chat flow

The Promptflow CLI provides a way to start an interactive chat session for a chat flow. You can use the command below to start an interactive chat session:

```bash
pf flow test --flow <flow_folder> --interactive
```

After executing this command, you can interact with the chat flow in the terminal. Press **Enter** to send a message to the chat flow, and quit with **Ctrl+C**.
The Promptflow CLI distinguishes the output of different roles by color: <span style="color:Green">User input</span>, <span style="color:Gold">Bot output</span>, <span style="color:Blue">Flow script output</span>, <span style="color:Cyan">Node output</span>.

> =========================================<br>
> Welcome to chat flow, <You-flow-name>.<br>
> Press Enter to send your message.<br>
> You can quit with ctrl+C.<br>
> =========================================<br>
> <span style="color:Green">User:</span> What types of container software there are<br>
> <span style="color:Gold">Bot:</span> There are several types of container software available, including:<br>
> 1. Docker: This is one of the most popular containerization software that allows developers to package their applications into containers and deploy them across different environments.<br>
> 2. Kubernetes: This is an open-source container orchestration platform that automates the deployment, scaling, and management of containerized applications.<br>
>
> <span style="color:Green">User:</span> What's the different between them<br>
> <span style="color:Gold">Bot:</span> The main difference between the various container software systems is their functionality and purpose. Here are some key differences between them:<br>
> 1. Docker is more focused on container packaging and deployment, while Kubernetes is more focused on container orchestration and management.<br>
> 2. Kubernetes: Kubernetes is a container orchestration tool that helps manage and deploy containers at scale. It automates the deployment, scaling, and management of containerized applications across multiple hosts.<br>

If you add `--verbose` to the pf command, the output of each step will be displayed.

> =========================================<br>
> Welcome to chat flow, Template Chat Flow.<br>
> Press Enter to send your message.<br>
> You can quit with ctrl+C.<br>
> =========================================<br>
> <span style="color:Green">User:</span> What types of container software there are<br>
> <span style="color:Cyan">chat:</span> There are several types of container software available, including:<br>
> 1. Docker: A popular container platform that is widely used in the industry.<br>
> 2. Kubernetes: A container orchestration tool that helps manage and deploy containers at scale.<br>
>
> <span style="color:Gold">Bot:</span> There are several types of container software available, including:<br>
> 1. Docker: A popular container platform that is widely used in the industry.<br>
> 2. Kubernetes: A container orchestration tool that helps manage and deploy containers at scale.<br>
>
> <span style="color:Green">User:</span> What's the different between them<br>
> <span style="color:Cyan">chat:</span> The main differences between container software are in their architecture, feature sets, and use cases. Here are some brief explanations of the differences between the examples I listed:<br>
> 1. Docker: Docker is a container platform that is widely used for building, packaging, and deploying containerized applications. It is known for its ease of use, portability, and large ecosystem of tools and services.<br>
> 2. Kubernetes: Kubernetes is a container orchestration tool that helps manage and deploy containers at scale. It automates the deployment, scaling, and management of containerized applications across multiple hosts.<br>
>
> <span style="color:Gold">Bot:</span> The main differences between container software are in their architecture, feature sets, and use cases. Here are some brief explanations of the differences between the examples I listed:<br>
> 1. Docker: Docker is a container platform that is widely used for building, packaging, and deploying containerized applications. It is known for its ease of use, portability, and large ecosystem of tools and services.<br>
> 2. Kubernetes: Kubernetes is a container orchestration tool that helps manage and deploy containers at scale. It automates the deployment, scaling, and management of containerized applications across multiple hosts.<br>
7 changes: 7 additions & 0 deletions apps/11_promptflow/playground_chat/azure_openai.yaml
@@ -0,0 +1,7 @@
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/AzureOpenAIConnection.schema.json
name: open_ai_connection
type: azure_open_ai
api_key: "<user-input>"
api_base: "<user-input>"
api_type: "azure"
api_version: "2024-07-01-preview"
12 changes: 12 additions & 0 deletions apps/11_promptflow/playground_chat/chat.jinja2
@@ -0,0 +1,12 @@
system:
You are a helpful assistant.

{% for item in chat_history %}
user:
{{item.inputs.question}}
assistant:
{{item.outputs.answer}}
{% endfor %}

user:
{{question}}
3 changes: 3 additions & 0 deletions apps/11_promptflow/playground_chat/data.jsonl
@@ -0,0 +1,3 @@
{"question": "What's the capital of France?"}
{"question": "What's the capital of Japan?"}
{"question": "What's the capital of China?"}