64 changes: 62 additions & 2 deletions apps/1_call_azure_openai_chat/README.md
@@ -1,12 +1,63 @@
# Call Azure OpenAI Service API from Python

This app demonstrates how to call the Azure OpenAI Service API from Python.

## Prerequisites

- Python 3.10 or later
- Azure OpenAI Service

## Overview

To call Azure OpenAI Service API, you can send HTTP requests directly to the API endpoint or use the [OpenAI Python API library](https://pypi.org/project/openai/).

**Send HTTP requests directly to the API endpoint**

```shell
YOUR_AOAI_NAME="your-aoai-name"
YOUR_DEPLOYMENT_ID="your-deployment-id"
YOUR_API_KEY="your-api-key"

curl -X 'POST' \
  "https://$YOUR_AOAI_NAME.openai.azure.com/openai/deployments/$YOUR_DEPLOYMENT_ID/chat/completions?api-version=2023-12-01-preview" \
  -H "api-key: $YOUR_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [
      {"role": "user", "content": "What is the weather like in Boston and New York?"}
    ]
  }'
```
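The endpoint returns a JSON payload with the generated message under the `choices` array. As a sketch (the response body below is illustrative, not captured from a live call), the assistant's reply can be extracted like this:

```python
import json

# Illustrative response body; a real response contains additional fields
# such as "id", "model", and "usage".
raw = """
{
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "I cannot access live weather data."}
    }
  ]
}
"""

response = json.loads(raw)

# The assistant's reply is the content of the first choice's message.
reply = response["choices"][0]["message"]["content"]
print(reply)
```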

**Use OpenAI Python API library**

```python
# Import modules
from os import getenv

from openai import AzureOpenAI

# Initialize the AzureOpenAI client
client = AzureOpenAI(
    api_key=getenv("AZURE_OPENAI_API_KEY"),
    api_version=getenv("AZURE_OPENAI_API_VERSION"),
    azure_endpoint=getenv("AZURE_OPENAI_ENDPOINT"),
)

# Call the chat completions API and get a response to the user input
response = client.chat.completions.create(
    model=getenv("AZURE_OPENAI_GPT_MODEL"),
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ],
)
```

For more information, see the following references.

- API Reference: [Azure OpenAI Service REST API reference](https://learn.microsoft.com/en-us/azure/ai-services/openai/reference)
- OpenAPI Spec: [Cognitive Services AzureOpenAI SDKs](https://github.com/Azure/azure-rest-api-specs/tree/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference)

## Usage

1. Get Azure OpenAI Service API key
@@ -30,6 +81,10 @@ $ python apps/1_call_azure_openai/main.py

### Example

To call the Azure OpenAI Service API, run the following command.

For details, see [Quickstart: Get started using GPT-35-Turbo and GPT-4 with Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart?tabs=command-line%2Cpython-new&pivots=programming-language-python).

```shell
$ python apps/1_call_azure_openai_chat/main.py
{
@@ -115,4 +170,9 @@ $ python apps/1_call_azure_openai_chat/main.py

## References

- [Quickstart: Get started using GPT-35-Turbo and GPT-4 with Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart?tabs=command-line%2Cpython-new&pivots=programming-language-python)
- Python basics
- [Python Cheatsheet > Basics](https://www.pythoncheatsheet.org/cheatsheet/basics)
- [venv — Creation of virtual environments](https://docs.python.org/3/library/venv.html#creating-virtual-environments)
- Azure OpenAI Basics
- [Azure OpenAI Service documentation](https://learn.microsoft.com/azure/ai-services/openai/)
- [Quickstart: Get started generating text using Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line%2Cpython-new&pivots=programming-language-python)
80 changes: 76 additions & 4 deletions apps/2_streamlit_chat/README.md
@@ -7,6 +7,82 @@ This app demonstrates how to create a chat application using Azure OpenAI Servic
- Python 3.10 or later
- Azure OpenAI Service

## Overview

In this app, you learn how to create an app with [Streamlit](https://streamlit.io/).

Streamlit is an open-source Python library that makes it easy to create and share custom web apps for machine learning and data science projects. It lets you build interactive applications with minimal code, written entirely in Python, and focuses on rapid prototyping and simplicity.

To get started, it is recommended to go through the [Streamlit documentation](https://docs.streamlit.io/get-started/installation/command-line) to create your first hello world app.

```shell
# Go to some directory
cd tmp

# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
source .venv/bin/activate

# Install Streamlit
pip install streamlit

# Run the Hello World script (./hello.py), shown below
streamlit run ./hello.py
```

**Create a "Hello World" app and run it**

```python
import streamlit as st

st.write("Hello world")
```

To go further, you can refer to the [API reference](https://docs.streamlit.io/develop/api-reference) to understand the different components and functionalities available in Streamlit. You can also refer to the [Streamlit Cheat Sheet](https://docs.streamlit.io/develop/quick-reference/cheat-sheet). There are many examples and tutorials available on the Streamlit website.

**Create a chat application**

For the chat application, you can refer to the [Streamlit Chat](https://streamlit.io/generative-ai) example below.
This app uses the OpenAI API to create a chatbot, and it uses the GPT-3.5-turbo model to generate responses.

```python
import streamlit as st
from openai import OpenAI

with st.sidebar:
    openai_api_key = st.text_input("OpenAI API Key", key="chatbot_api_key", type="password")
    "[Get an OpenAI API key](https://platform.openai.com/account/api-keys)"
    "[View the source code](https://github.com/streamlit/llm-examples/blob/main/Chatbot.py)"
    "[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/streamlit/llm-examples?quickstart=1)"

st.title("💬 Chatbot")

if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant", "content": "How can I help you?"}]

for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input():
    if not openai_api_key:
        st.info("Please add your OpenAI API key to continue.")
        st.stop()

    client = OpenAI(api_key=openai_api_key)
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=st.session_state.messages)
    msg = response.choices[0].message.content
    st.session_state.messages.append({"role": "assistant", "content": msg})
    st.chat_message("assistant").write(msg)
```

### **_Exercise: Convert OpenAI API to Azure OpenAI Service_**

In this exercise, convert the OpenAI API calls to Azure OpenAI Service API calls, using the knowledge you gained in the previous chapter, [1_call_azure_openai_chat](../1_call_azure_openai_chat/README.md).

## Usage

1. Get Azure OpenAI Service API key
@@ -41,7 +117,3 @@ To start a conversation, fill in the required fields in the sidebar and you will
## Note

This app uses `st.session_state.messages` to store messages for chat. This is a mechanism to store messages per session on the process side of the application. Messages will disappear when the session ends.
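As a rough sketch of why the messages disappear, the history lives in an in-memory, per-session dictionary on the application process; when the session is discarded, so is the history (the names below are illustrative, not Streamlit internals):

```python
# Minimal model of per-session, in-process chat history (like st.session_state).
sessions: dict[str, dict] = {}

def get_state(session_id: str) -> dict:
    # Each session gets its own state dict; created lazily on first access.
    return sessions.setdefault(session_id, {"messages": []})

state = get_state("session-1")
state["messages"].append({"role": "user", "content": "Hello"})
print(len(get_state("session-1")["messages"]))  # history survives within the session

del sessions["session-1"]  # session ends (browser tab closed, process restarted, ...)
print(len(get_state("session-1")["messages"]))  # a fresh, empty state is created
```

Persisting history across sessions requires an external store, which later chapters address.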

## References

- [Your LLM code playground](https://streamlit.io/generative-ai)
3 changes: 3 additions & 0 deletions apps/2_streamlit_chat/hello.py
@@ -0,0 +1,3 @@
import streamlit as st

st.write("Hello world")
60 changes: 52 additions & 8 deletions apps/6_call_azure_ai_search/README.md
@@ -8,13 +8,54 @@ This application explains how to call Azure AI Search from Python.
- Azure AI Search
- Azure OpenAI Service

## Overview

Azure AI Search (formerly known as Azure Cognitive Search) is a fully managed cloud search service that provides information retrieval over user-owned content.
Its data plane REST APIs cover the indexing and query workflows; the [Azure AI Search REST API reference](https://learn.microsoft.com/rest/api/searchservice/?view=rest-searchservice-2023-11-01) documents them in detail.

REST API specs in OpenAPI format are available in the [Azure/azure-rest-api-specs](https://github.com/Azure/azure-rest-api-specs/tree/main/specification/search/data-plane/Azure.Search) repository.

[Samples for Azure Cognitive Search client library for Python](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/search/azure-search-documents/samples) are available in the Azure SDK for Python repository.

The [Azure AI Search client library for Python - version 11.5.1](https://learn.microsoft.com/en-us/python/api/overview/azure/search-documents-readme?view=azure-python) provides lower-level APIs for working with Azure AI Search, giving you flexible, fine-grained control over the service.
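As a sketch of what a data plane query looks like, the request below is built with the standard library only; the service name, index name, and API key are placeholders, and actually sending the request requires a real service:

```python
import json
from urllib.request import Request

SERVICE = "your-search-service"  # placeholder
INDEX = "your-index"             # placeholder
API_VERSION = "2023-11-01"

# Data plane query endpoint: POST /indexes/{index}/docs/search
url = (
    f"https://{SERVICE}.search.windows.net"
    f"/indexes/{INDEX}/docs/search?api-version={API_VERSION}"
)
body = json.dumps({"search": "meeting rules", "top": 3}).encode()
req = Request(
    url,
    data=body,
    method="POST",
    headers={"Content-Type": "application/json", "api-key": "<your-api-key>"},
)

print(req.full_url)
# To actually send it: urllib.request.urlopen(req) -- requires valid credentials.
```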

**Introducing LangChain**

[LangChain](https://github.com/langchain-ai/langchain) is a framework for developing applications powered by large language models (LLMs).
It provides a set of tools and abstractions to help you build and deploy LLM-powered applications in production.

![LangChain Framework](https://raw.githubusercontent.com/langchain-ai/langchain/master/docs/static/svg/langchain_stack_062024.svg)

The OpenAI Python SDK, by contrast, provides a direct interface to OpenAI's API, enabling developers to integrate OpenAI's language models into their applications.

The relationship between LangChain and the OpenAI Python SDK is complementary. LangChain leverages the OpenAI Python SDK to access and utilize OpenAI's models, providing a higher-level abstraction that simplifies the integration of these models into more complex workflows and applications.

**Use LangChain to access Azure AI Search easily**

The [Azure AI Search](https://python.langchain.com/v0.2/docs/integrations/vectorstores/azuresearch/) vector store integration in LangChain provides a simple, easy way to access Azure AI Search from Python.

**Use RecursiveCharacterTextSplitter to recursively split text by characters**

Before loading text into a search index, you usually need to split it into smaller chunks.
Splitting text by characters is a common task in natural language processing (NLP) and information retrieval (IR) applications, but implementing it by hand is tedious and error-prone.
LangChain's `RecursiveCharacterTextSplitter` provides a simple way to recursively split text by characters. Details are available at the following link.

- [How to recursively split text by characters](https://python.langchain.com/v0.2/docs/how_to/recursive_text_splitter/)
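To illustrate the idea, here is a simplified sketch of recursive splitting (not LangChain's actual implementation): try separators in order, paragraphs first, then lines, then words, recursing with the next separator whenever a piece is still too large.

```python
def recursive_split(text: str, chunk_size: int, separators=("\n\n", "\n", " ")) -> list[str]:
    """Simplified sketch of recursive character splitting."""
    if len(text) <= chunk_size:
        return [text] if text else []
    if not separators:
        # No separator left: fall back to a hard cut.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    sep, rest = separators[0], separators[1:]
    chunks = []
    for part in text.split(sep):
        if len(part) <= chunk_size:
            chunks.append(part)
        else:
            # Piece is still too large: recurse with the next separator.
            chunks.extend(recursive_split(part, chunk_size, rest))
    return [c for c in chunks if c]

text = "Azure AI Search indexes documents.\n\nLangChain splits long text into chunks."
for chunk in recursive_split(text, chunk_size=40):
    print(chunk)
```

The real `RecursiveCharacterTextSplitter` additionally merges small pieces back together up to the chunk size and supports overlap between chunks, which this sketch omits.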

## Usage

1. Get the API key for Azure AI Search
1. Copy [.env.template](../../.env.template) to `.env` in the same directory
1. Set credentials in `.env`
1. Run scripts in the [apps/6_call_azure_ai_search](./) directory

> [!CAUTION]
> `AZURE_AI_SEARCH_INDEX_NAME` in `.env` should be unique and should not be changed once set.
> If you change the index name, you will need to recreate the index and re-upload the documents.

Set up the environment and install dependencies:

```shell
# Create a virtual environment
$ python -m venv .venv
@@ -24,20 +65,23 @@ $ source .venv/bin/activate

# Install dependencies
$ pip install -r requirements.txt
```

Create an index in Azure AI Search and upload documents:

> [!CAUTION]
> This script should be run only once to avoid creating duplicate indexes.

```shell
$ python apps/6_call_azure_ai_search/1_create_index.py
```

Search documents in Azure AI Search:

```shell
$ python apps/6_call_azure_ai_search/2_search_index.py

> All meetings must include a 5-minute meditation session.
> All meetings must begin with a joke.
> All meetings must have a theme, such as pirate or superhero.
```

## References

- [How to recursively split text by characters](https://python.langchain.com/v0.2/docs/how_to/recursive_text_splitter/)
- [LangChain > Azure AI Search](https://python.langchain.com/v0.2/docs/integrations/vectorstores/azuresearch/)
41 changes: 40 additions & 1 deletion apps/7_streamlit_chat_rag/README.md
@@ -1,13 +1,41 @@
# Add RAG feature to Streamlit chat app

This app demonstrates how to add the Retrieval-Augmented Generation (RAG) feature to a Streamlit chat app.

## Prerequisites

- Python 3.10 or later
- Azure OpenAI Service
- Azure AI Search

## Overview

**What is RAG?**

Retrieval-Augmented Generation (RAG) is a technique that combines the strengths of retrieval systems and generative models.
It uses a retriever to find relevant passages in a large corpus, then uses a generator to produce a response grounded in the retrieved passages.
In this way, RAG can generate more accurate and informative responses than a generation model on its own.
This chapter provides a practical example of how to use RAG in a Streamlit chat app, building on what you learned in the following previous chapters.

- LLM: Azure OpenAI Service @ [1_call_azure_openai_chat](../1_call_azure_openai_chat/README.md)
- Chat app: Streamlit @ [2_streamlit_chat](../2_streamlit_chat/README.md)
- Search: Azure AI Search @ [6_call_azure_ai_search](../6_call_azure_ai_search/README.md)

This app combines the components above into a chat app with a RAG feature.
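The end-to-end flow can be sketched with a toy retriever and a prompt builder; in the real app, Azure AI Search plays the retriever role and Azure OpenAI generates the answer (the corpus and scoring below are illustrative):

```python
# Toy corpus standing in for documents in an Azure AI Search index.
corpus = [
    "All meetings must include a 5-minute meditation session.",
    "All meetings must begin with a joke.",
    "Expense reports are due on the last day of the month.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    # Stand-in for Azure AI Search: rank passages by word overlap with the query.
    q = set(query.lower().split())
    scored = sorted(corpus, key=lambda p: len(q & set(p.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str) -> str:
    # Grounding step: the retrieved passages become context for the LLM prompt.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What must all meetings include?"))
```

The LLM then answers from the grounded prompt instead of relying solely on its training data.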

**Introducing Function calling**

![Function Calling](https://cdn.openai.com/API/docs/images/function-calling-diagram.png)

Function calling is a feature that lets you connect LLM models to external tools and systems. This is useful for many things, such as giving AI assistants new capabilities or building deep integrations between your applications and the models.

For example, if you want to implement a chat app with a RAG feature, you can use function calling to connect the LLM model to external knowledge bases or search engines. The model can then retrieve relevant information from the knowledge base or search engine and generate a response based on that information.
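The mechanics can be sketched without any SDK: the app advertises tool schemas to the model, the model replies with a tool name and JSON-encoded arguments, and the app dispatches the call and returns the result as a tool message. The schema, tool call, and dispatcher below are illustrative:

```python
import json

# Tool schema advertised to the model (OpenAI-style function definition).
tools = [{
    "type": "function",
    "function": {
        "name": "search_docs",
        "description": "Search the knowledge base for relevant passages.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

def search_docs(query: str) -> str:
    # Stand-in for a real Azure AI Search call.
    return f"Top result for '{query}'"

dispatch = {"search_docs": search_docs}

# Pretend the model responded with this tool call (normally part of the API response).
tool_call = {"name": "search_docs", "arguments": json.dumps({"query": "Contoso policy"})}

# Dispatch the call; the result would be sent back to the model as a "tool" message.
result = dispatch[tool_call["name"]](**json.loads(tool_call["arguments"]))
print(result)
```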

**Introducing `AgentExecutor` for function calling**

Implementing function calling from scratch can be complex and time-consuming. To make it easier, LangChain provides `AgentExecutor`.
`AgentExecutor` runs the agent loop for you: the LLM decides which tool to call, the executor invokes it, and the result is fed back to the model until it produces a final answer. This lets you build deep integrations between your applications and the models with much less boilerplate.

## Usage

1. Get Azure OpenAI Service API key
@@ -37,3 +65,14 @@ Access `http://localhost:8501` and set the required fields in the sidebar to sta
When you send a question about Contoso Corporation, the chatbot will respond with an answer from Azure AI Search.

![RAG Chat](../../docs/images/7_streamlit_chat_rag.main.png)

To see how the RAG feature works, watch the video below.

[![Streamlit Chat App with RAG Feature](https://img.youtube.com/vi/ummiu-rzYvs/0.jpg)](https://youtu.be/ummiu-rzYvs)

### How to customize

You can customize the chat app by modifying the following code:

- [main.py](./main.py): `CUSTOM_SYSTEM_PROMPT` variable which defines the system prompt
- [tools/fetch_contents.py](./tools/fetch_contents.py): the `fetch_contents` function's comments, which are passed to the LLM model as the tool description