This repo contains a growing suite of example applications for [Llama Stack](https://github.com/meta-llama/llama-stack) that demonstrate various stack features and common application patterns.

See the corresponding README for each example for more information. Here we summarize the examples.
## Application Examples
These applications are in the [`apps`](apps/) directory.

* [`01-chatbot`](apps/01-chatbot): A getting-started chatbot app, which shows how to build and deploy Llama Stack applications. It includes two different UI options and inference with an [ollama](https://ollama.com)-hosted [Llama 3](https://www.llama.com/models/llama-3/) model.
* [`02-deep-research`](apps/02-deep-research): A _deep research_ app (under development), which illustrates an emerging, common application pattern for AI: the user asks for detailed information about a topic (for example, the market performance and financials of a publicly-traded company), agents find relevant data from diverse sources, and finally an LLM digests the retrieved information and prepares a report. This example will demonstrate Llama Stack support for agent-based application development, including the use of protocols like [MCP](https://modelcontextprotocol.io/introduction).
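
The flow just described can be sketched in miniature. Everything below (the fetchers and the summarizer) is a toy stand-in for illustration only, not a component of the actual app:

```python
# Minimal sketch of the deep-research pattern described above: agents gather
# findings from several sources, then a model digests them into a report.
# The fetchers and summarizer are toy stand-ins, not the app's real components.

def research(topic, fetchers, summarize):
    findings = [fetch(topic) for fetch in fetchers]  # agents gather data
    return summarize(topic, findings)                # an LLM would digest it

# Toy stand-ins for illustration only:
fetchers = [
    lambda t: f"recent news about {t}",
    lambda t: f"financial filings for {t}",
]

def summarize(topic, findings):
    return f"Report on {topic}: " + "; ".join(findings)

print(research("ACME Corp", fetchers, summarize))
```

In the real app, each fetcher would be an agent calling a data source (possibly over MCP), and `summarize` would be an LLM call.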
## Notebook Examples
These examples use Jupyter notebooks to illustrate their concepts. They are located in the [`notebooks`](notebooks/) directory.

* [`01-responses`](notebooks/01-responses): This notebook demonstrates how to use the Llama Stack _Responses_ API for simple inference, Retrieval-Augmented Generation (RAG), and Model Context Protocol (MCP) tool calling.
> [!NOTE]
> **Please join us!** We welcome new examples. You can submit them or submit improvements to the current examples using [PRs](https://github.com/The-AI-Alliance/llama-stack-usecase1/pulls), make suggestions or report bugs as [issues](https://github.com/The-AI-Alliance/llama-stack-usecase1/issues), or use the [discussions forum](https://github.com/The-AI-Alliance/llama-stack-usecase1/discussions) for general questions and suggestions. For more information about joining this project or other AI Alliance projects, go [here](https://the-ai-alliance.github.io/contributing/). The main AI Alliance website is [here](https://aialliance.org).
>
> If you are interested in running Llama Stack on Kubernetes or OpenShift, see [these examples from opendatahub.io](https://github.com/opendatahub-io/llama-stack-demos).

---

# `notebooks/01-responses/README.md`

The Responses API provides a unified interface for AI interactions that can handle simple inference, Retrieval-Augmented Generation (RAG), and MCP tool calling.

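
As a rough sketch of what such a call looks like, the snippet below posts to a locally running Llama Stack server through its OpenAI-compatible endpoint. This is a hedged illustration: the base URL, endpoint path, and model id are assumptions that may differ across Llama Stack versions and configurations; the notebook shows the supported usage.

```python
# Hypothetical sketch of a Responses API call using only the standard library.
# The base URL, endpoint path, and model id are assumptions; adjust them to
# match your Llama Stack server and the notebook's configuration.
import json
import urllib.request

def create_response(prompt: str,
                    model: str = "llama3.2:3b",
                    base_url: str = "http://localhost:8321/v1/openai/v1") -> dict:
    """POST a minimal request to an OpenAI-compatible /responses endpoint."""
    payload = json.dumps({"model": model, "input": prompt}).encode("utf-8")
    request = urllib.request.Request(
        f"{base_url}/responses",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as reply:
        return json.load(reply)

# With the server running (see the instructions below), you might call:
# create_response("Which national parks are closest to Boston?")
```
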
## Instructions
To see all the prerequisites for running the notebook and how to meet them, see the instructions at the start of the notebook. Set up a Python 3.12 environment, e.g., using Anaconda, `venv`, or a similar tool. Then run the following commands in one terminal window:

```shell
# Install the dependencies:
pip install -r requirements.txt

# Run the Llama Stack server:
llama stack run run.yaml --image-type venv --port 8321
```
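
Before opening the notebook, you can verify that the server is actually listening. This small helper is a convenience sketch, and it assumes the default port 8321 used above:

```python
# Quick sanity check that the Llama Stack server is accepting connections.
# Assumes the default port 8321 used in the `llama stack run` command above.
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(port_open("localhost", 8321))  # True once the server is up
```
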
In a second window, run the National Park Service MCP server, following the instructions at the start of the notebook.

Open the notebook [`reponses-api.ipynb`](reponses-api.ipynb) using [Jupyter](https://jupyter.org/); Jupyter is not installed as part of `requirements.txt`. You can install Jupyter Lab as follows:

```shell
pip install jupyterlab
```
Then run the lab environment and open the notebook from the file browser on the left-hand side:

```shell
jupyter lab
```
## What You'll Learn

The notebook demonstrates three main approaches to using the Responses API: simple inference, Retrieval-Augmented Generation (RAG), and MCP tool calling.

## Troubleshooting

2. **API key**: Check that `OPENAI_API_KEY` is set:

   ```bash
   echo $OPENAI_API_KEY  # Should show your key
   ```
3. **Connection error**: You might need to change the port used for Llama Stack. Note the URL printed in the terminal window where Llama Stack is running, and make sure `LLAMA_STACK_URL` in the notebook uses the same port number.
4. **Python version issues**: Ensure you are using Python 3.12+:

   ```bash
   python --version
   ```
5. **Dependency conflicts**: Use a fresh virtual environment if you encounter package conflicts.
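
If you hit the connection error described in item 3, one way to keep the notebook and the server in sync is to derive the URL from an environment variable. This is an illustrative sketch only; the variable name `LLAMA_STACK_PORT` is an assumption made for this example, not something the notebook defines:

```python
# Illustrative sketch: build LLAMA_STACK_URL from an environment variable so
# the notebook follows whatever port `llama stack run` actually used.
# The variable name LLAMA_STACK_PORT is an assumption made for this example.
import os

LLAMA_STACK_PORT = os.environ.get("LLAMA_STACK_PORT", "8321")
LLAMA_STACK_URL = f"http://localhost:{LLAMA_STACK_PORT}"
print(LLAMA_STACK_URL)
```
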
## Support
This notebook and README were developed with assistance from Google Gemini and Cursor using Claude Sonnet 4. For issues with:

- **The notebook itself**: Review the notebook cells for detailed explanations that may help. Post an [issue](https://github.com/The-AI-Alliance/llama-stack-examples/issues) or open a [discussion](https://github.com/The-AI-Alliance/llama-stack-examples/discussions) if you need more help.
- **Llama Stack**: Check the [official documentation](https://llama-stack.readthedocs.io/).
- **MCP Protocol**: See the [Model Context Protocol docs](https://modelcontextprotocol.io/).