Demo showcasing local AI using Azure Foundry Local, LangChain, OpenAI-compatible APIs, and Microsoft Bot Framework (Python).
Features:

- Run locally
- Bot interaction
- LangChain integration
This project combines:

- Microsoft Bot Framework (Python) – handles the chat interface.
- Azure Foundry Local – runs LLMs on-device via an OpenAI-compatible endpoint.
- LangChain – chains prompt templates with the local LLM.
- OpenAI-style API – uses the `openai` Python SDK to talk to Foundry Local.
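To give a taste of the OpenAI-compatible surface, here is a minimal sketch of calling Foundry Local with the `openai` SDK. The base URL, port, and model alias are assumptions, not values from this repo; Foundry Local prints the actual endpoint when the service starts.

```python
# Sketch: talking to Foundry Local's OpenAI-compatible endpoint via the
# openai Python SDK. Endpoint URL and alias below are assumptions.
import os

FOUNDRY_BASE_URL = os.environ.get("FOUNDRY_BASE_URL", "http://localhost:5273/v1")  # assumed port
FOUNDRY_ALIAS = os.environ.get("FOUNDRY_ALIAS", "phi-3-mini-4k")

def build_chat_payload(prompt: str) -> dict:
    """Shape of an OpenAI-style chat completion request."""
    return {
        "model": FOUNDRY_ALIAS,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    from openai import OpenAI
    # The local server ignores the API key, but the SDK requires one.
    client = OpenAI(base_url=FOUNDRY_BASE_URL, api_key="not-needed")
    reply = client.chat.completions.create(**build_chat_payload("Hello!"))
    print(reply.choices[0].message.content)
```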
Ensure the following are installed:

- Python 3.10+
- Foundry Local (Windows/macOS)
- Bot Framework SDK for Python
- A `pip`-managed virtual environment (recommended: `venv` or `conda`)
```bash
# Clone the repo
git clone https://github.com/MusaddiqueHussainLabs/foundry_local_demo.git
cd foundry_local_demo

# Create a virtual environment
python3 -m venv .venv
source .venv/bin/activate   # macOS/Linux
.\.venv\Scripts\activate    # Windows

# Install required packages
pip install -r requirements.txt
pip install foundry-local-sdk langchain[openai] botbuilder-integration-aiohttp
```
Set up environment variables for Foundry Local:

```bash
export FOUNDRY_ALIAS="phi-3-mini-4k"
export BOT_APP_ID=""
export BOT_APP_PASSWORD=""
```

- `FOUNDRY_ALIAS`: the Foundry model alias. Tip: list available models via `foundry model list`.
- Bot credentials (`BOT_APP_ID`, `BOT_APP_PASSWORD`): required by the Microsoft Bot Framework.
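A minimal sketch of how `config.py` might pick up these variables; the class and attribute names here are illustrative assumptions, not necessarily the repo's actual code.

```python
# Sketch of config.py: read bot and Foundry settings from the environment,
# with defaults matching the exports above. Names are assumptions.
import os

class DefaultConfig:
    """Bot and Foundry Local configuration."""
    PORT = int(os.environ.get("PORT", 3978))              # bot endpoint port
    APP_ID = os.environ.get("BOT_APP_ID", "")             # empty for local testing
    APP_PASSWORD = os.environ.get("BOT_APP_PASSWORD", "")
    FOUNDRY_ALIAS = os.environ.get("FOUNDRY_ALIAS", "phi-3-mini-4k")
```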
```bash
foundry model run $FOUNDRY_ALIAS
```

This starts the local server and downloads a hardware-optimized model on first use.

```bash
python app.py
```

This starts the bot endpoint (e.g., at `http://localhost:3978/api/messages`).
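The routing from incoming message to LLM reply can be sketched as below. This is a hypothetical simplification (class and function names are assumptions); the real `app.py` would subclass `ActivityHandler` from `botbuilder.core`.

```python
# Sketch: route each incoming chat message through the LangChain chain
# and send the model's answer back. Names here are assumptions.

def respond(chain, user_text: str) -> str:
    """Run one user message through the chain; return plain text."""
    result = chain.invoke({"input": user_text})
    # LangChain chat models return a message object with .content
    return getattr(result, "content", str(result))

class FoundryBot:  # in app.py this would subclass botbuilder.core.ActivityHandler
    def __init__(self, chain):
        self.chain = chain

    async def on_message_activity(self, turn_context):
        reply = respond(self.chain, turn_context.activity.text)
        await turn_context.send_activity(reply)
```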
Use the Bot Framework Emulator or integrate with Microsoft Teams. Type natural-language prompts and receive responses via Foundry Local + LangChain.
```
.
├── app.py            # Bot server + message routing
├── bot.py            # LangChain setup (prompt → LLM → response)
├── config.py         # Configuration
├── requirements.txt  # Python dependencies
└── README.md         # This file
```
How it works:

- Foundry Local – started via `FoundryLocalManager`, which loads a model by alias and exposes an OpenAI-compatible endpoint at `http://localhost:<port>/`.
- LangChain integration – a `ChatPromptTemplate` is piped into `ChatOpenAI` pointed at the Foundry endpoint.
- Bot Framework integration – `app.py` routes messages: user input → LangChain chain → LLM → response back to the user.
- OpenAI SDK usage – supports OpenAI-style calls (`openai.chat.completions.create`) for compatibility.

References:

- Foundry Local overview & CLI Quickstart (learn.microsoft.com)
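The prompt-to-LLM wiring in `bot.py` could look roughly like this sketch, assuming the `langchain-openai` package and a running Foundry Local endpoint (the URL, alias, and system prompt are assumptions):

```python
# Sketch of bot.py: ChatPromptTemplate piped into ChatOpenAI, with the
# OpenAI-compatible base_url pointed at Foundry Local. Values are assumptions.
SYSTEM_PROMPT = "You are a helpful assistant running fully on-device."

def make_chain(base_url="http://localhost:5273/v1", model="phi-3-mini-4k"):
    # Imported lazily so the module loads even without LangChain installed.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI

    prompt = ChatPromptTemplate.from_messages(
        [("system", SYSTEM_PROMPT), ("human", "{input}")]
    )
    llm = ChatOpenAI(base_url=base_url, api_key="not-needed", model=model)
    return prompt | llm  # LCEL: the prompt feeds the local LLM

# usage: make_chain().invoke({"input": "Hello!"})
```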