
AI Agent Chatbot (Agno + Streamlit): Simplest Demo

A simple, containerized chatbot demo using Agno AgentOS.

🏷️ Description

This project is a chatbot designed to interact with agents running on Agno AgentOS (multiple AI agents backed by MongoDB storage).

Depending on the selected configuration, the agent can use different LLM models, tools, and instructions. The chatbot handles both text and images. The available tools include Search (using DuckDuckGo) and MCP (using Airbnb).

This project is intended for exploration and community sharing.

✅ Features

  • Interaction with Agents running on Agno AgentOS server (backend)
  • Selection of different LLM models, tools, and instructions for the agent
  • Text and image management
  • Available tools:
    • Search (DuckDuckGo)
    • MCP (Airbnb)

Demo

  • Analyse image (multimodal)
  • Search the web (DuckDuckGo tools)
  • Airbnb MCP

🔜 Soon

  • More MCPTools examples
  • TTS (Text-to-Speech) Implementation
  • STT (Speech-to-Text) Implementation
  • Streaming AI Agent Responses
  • Upload a PDF or Word file to interact with its content

📋 Prerequisites

  • Linux (AMD & Nvidia) or Windows with WSL2 (Nvidia)
  • Docker installed
  • Docker Compose installed
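The prerequisites above can be checked from a terminal before building. This is a sketch: the CUDA image tag is an assumption, and the GPU check only applies to NVIDIA setups with the NVIDIA Container Toolkit installed.

```shell
# Verify Docker and the Compose v2 plugin are installed.
if command -v docker >/dev/null 2>&1; then
  docker --version
  docker compose version
  DOCKER_READY=yes
else
  DOCKER_READY=no
fi
echo "docker ready: $DOCKER_READY"

# NVIDIA only: confirm the container toolkit exposes the GPU to Docker
# (image tag is an assumption; any CUDA base image works here).
if [ "$DOCKER_READY" = "yes" ]; then
  docker run --rm --gpus all nvidia/cuda:12.4.0-base-ubuntu22.04 nvidia-smi \
    || echo "GPU not visible to Docker (AMD setups can ignore this)"
fi
```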

🛠️ Installation

AMD

To install the project on AMD hardware, use compose.amd.yml:

docker compose -f compose.amd.yml build --no-cache
docker compose -f compose.amd.yml up -d

NVIDIA

To install the project on NVIDIA hardware, use compose.nvidia.yml:

docker compose -f compose.nvidia.yml build --no-cache
docker compose -f compose.nvidia.yml up -d

Check installation

docker ps

You should see 4 services:

  • mongodb_service
  • ollama_service
  • backend_agentos
  • frontend_streamlit

Example output:

IMAGE                COMMAND                  STATUS          PORTS                                             NAMES
streamlit-frontend   "streamlit run app/m…"   Up 14 minutes   0.0.0.0:8501->8501/tcp, [::]:8501->8501/tcp       frontend_streamlit
streamlit-backend    "fastapi run app/mai…"   Up 14 minutes   0.0.0.0:8000->8000/tcp, [::]:8000->8000/tcp       backend_agentos
ollama/ollama:rocm   "/bin/ollama serve"      Up 17 minutes   0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp   ollama_service
mongo:latest         "docker-entrypoint.s…"   Up 17 minutes   0.0.0.0:27017->27017/tcp, [::]:27017->27017/tcp   mongodb_service
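With the four services up, a quick smoke test can confirm the two HTTP endpoints respond. The ports come from the `docker ps` output above; the Streamlit health path `/_stcore/health` and the FastAPI `/docs` page are assumptions based on the images in use.

```shell
# Endpoints taken from the published ports above.
FRONTEND=http://localhost:8501
BACKEND=http://localhost:8000

# Streamlit ships a built-in health endpoint that answers "ok" when up.
curl -fsS "$FRONTEND/_stcore/health" || echo "frontend not reachable"

# FastAPI serves its interactive docs page at /docs by default.
curl -fsS -o /dev/null "$BACKEND/docs" && echo "backend up" \
  || echo "backend not reachable"
```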

Pull Ollama models

Once the services are installed and running, an LLM model is required. You can change the names and options of the LLM models in the file backend/app/config.yaml.

For the basic configuration, follow these steps:

Check the models available in your Ollama container:

docker exec -it ollama_service ollama list

Download the models to your Ollama container:

docker exec -it ollama_service ollama pull devstral-small-2:24b
docker exec -it ollama_service ollama pull ministral-3:8b
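After pulling, the models should appear in the container's local list, and a one-off prompt confirms a model actually loads. A sketch, using one of the model names from above (it must match what backend/app/config.yaml expects):

```shell
# One of the models pulled above.
MODEL=ministral-3:8b

# Confirm the model is in the local registry.
docker exec ollama_service ollama list | grep "$MODEL" \
  || echo "model not pulled yet"

# Run a one-off prompt to confirm it loads (first run may be slow).
docker exec ollama_service ollama run "$MODEL" "Reply with one word: ready" \
  || echo "could not run model"
```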

🗑️ Delete

To uninstall the project, use the same compose file as during setup.

AMD:

docker compose -f compose.amd.yml down

NVIDIA:

docker compose -f compose.nvidia.yml down
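`down` by itself keeps any named volumes. Assuming the compose files declare volumes for MongoDB data and downloaded Ollama models, a full cleanup would also pass `-v` (sketch, shown for the AMD file):

```shell
# Pick the compose file you installed with.
COMPOSE_FILE=compose.amd.yml   # or compose.nvidia.yml

# -v also removes named volumes (MongoDB data, Ollama models),
# and --rmi local drops the images built for this project.
docker compose -f "$COMPOSE_FILE" down -v --rmi local \
  || echo "docker not available"
```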

💻 Technology

Technology   Description                                         License      Documentation
Python       A high-level programming language.                  PSF          Docs
Docker       A platform for containerization.                    Apache 2.0   Docs
MongoDB      A document-oriented, operational database.          SSPL         Docs
Ollama       A tool for running large language models locally.   MIT          Docs
Agno         A unified stack for multi-agent systems.            Apache 2.0   Docs

📂 Project structure

project/
│
├── backend/
│   ├── app/
│   │   ├── core/
│   │   │   └── agent.py
│   │   ├── utils/
│   │   │   └── param.py
│   │   └── main.py
│   ├── requirements.txt
│   ├── Dockerfile
│   └── config.yaml
│
├── frontend/
│   ├── app/
│   │   ├── assets/
│   │   │   └── image/
│   │   │       └── logo.jpg
│   │   ├── files/
│   │   ├── pages/
│   │   │   ├── chatbot.py
│   │   │   └── pdf.py
│   │   ├── utils/
│   │   │   ├── file.py
│   │   │   ├── curl.py
│   │   │   ├── time.py
│   │   │   ├── utils.py
│   │   │   └── widget.py
│   │   └── main.py
│   ├── requirements.txt
│   └── Dockerfile
│
├── compose.amd.yml
├── compose.nvidia.yml
└── README.md

🧪 Test

This project has been tested on the following hardware configuration:

  • OS: Linux Solus
  • CPU: AMD Ryzen 5 9600X
  • GPU: AMD Radeon Pro 9700 AI & AMD RX 9070
  • Storage: 2.5" SATA SSD

🚨 Warning

No authentication (password or JWT verification key) is configured for service connections (MongoDB or AgentOS).

📃 License

This project is open source and the code is usable and modifiable. However, the author disclaims all responsibility and no technical support is provided.
