A full-stack, Databricks-themed conversational assistant for supply chain queries, powered by OpenAI Agents and Databricks MCP servers. Includes a React chat UI and a FastAPI backend that streams agent responses.
- Conversational chat UI (React) with Databricks red palette
- FastAPI backend with a streaming `/chat` endpoint
- Secure Databricks MCP integration
- Example agent logic and tool usage
- Modern UX, easy local development
```
/ (root)
├── main.py            # Example CLI agent runner
├── api_server.py      # FastAPI backend for chat UI
├── requirements.txt   # Python dependencies
├── ui/                # React frontend (Vite)
│   ├── src/components/ChatUI.jsx, ChatUI.css, ...
│   └── ...
└── README.md          # (this file)
```
You can kick-start your project with Databricks' Supply-Chain Optimization Solution Accelerator (or another accelerator if you work in a different industry). Clone the accelerator's GitHub repo into your Databricks workspace and run the bundled notebooks, starting with notebook 1:
https://github.com/lararachidi/agent-supply-chain/blob/main/README.md
These notebooks stand up every asset the Agent will later reach via MCP, from raw enterprise tables and unstructured e-mails to classical ML models and graph workloads.
- Python 3.10+
- Node.js 18+
- Databricks credentials in `~/.databrickscfg`
- (Optional) Virtualenv/pyenv for Python isolation
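The `~/.databrickscfg` file is plain INI format, which `databricks.sdk` reads automatically to resolve your host and token. The sketch below shows what the SDK effectively does, using only the standard library; the function name is illustrative, and in practice you would let the SDK handle this:

```python
import configparser
from pathlib import Path

def load_databricks_profile(path="~/.databrickscfg", profile="DEFAULT"):
    """Read host/token from a .databrickscfg INI file.

    This mirrors what databricks.sdk does for you automatically;
    it is shown here only to illustrate the expected file layout:

        [DEFAULT]
        host  = https://<workspace>.cloud.databricks.com
        token = dapi...
    """
    cfg = configparser.ConfigParser()
    cfg.read(Path(path).expanduser())
    section = cfg[profile] if profile in cfg else cfg["DEFAULT"]
    return {"host": section.get("host"), "token": section.get("token")}
```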
```
pip install -r requirements.txt
```
To kick off the backend, run:
```
python -m uvicorn api_server:app --reload --port 8000
```
- The API will be available at http://localhost:8000
- FastAPI docs: http://localhost:8000/docs
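`api_server.py`'s exact wire format isn't reproduced here, but a streaming `/chat` endpoint typically frames each agent token as a Server-Sent Events chunk. Below is a minimal, framework-free sketch of that framing; the `delta` field and `[DONE]` sentinel are illustrative conventions, not necessarily what the server actually emits:

```python
import json

def sse_chunks(tokens):
    """Yield agent output tokens framed as SSE 'data:' events.

    In FastAPI, a /chat endpoint would return this generator via
    StreamingResponse(sse_chunks(...), media_type="text/event-stream")
    so the UI can render tokens as they arrive.
    """
    for tok in tokens:
        yield f"data: {json.dumps({'delta': tok})}\n\n"
    # Illustrative end-of-stream sentinel
    yield "data: [DONE]\n\n"
```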
In a different terminal, run the following:
```
cd ui
npm install
npm run dev
```
- The app will be available at http://localhost:5173
- Open http://localhost:5173 in your browser.
- Type a supply chain question (e.g., "What are the delays with distribution center 5?") and hit Send.
- The agent will stream back a response from the Databricks MCP server.
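On the consuming side, a client reassembles the full reply from the stream as chunks arrive. The sketch below assumes an SSE-style stream where each `data:` line carries a JSON `delta` field and a `[DONE]` sentinel ends the stream; this framing is an illustrative assumption, not the server's documented format:

```python
import json

def assemble_reply(stream_lines):
    """Reassemble a complete agent reply from SSE-style 'data:' lines.

    Assumes each event payload is JSON with a 'delta' token field and
    the stream ends with a literal [DONE] sentinel (illustrative only).
    """
    parts = []
    for line in stream_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        parts.append(json.loads(payload).get("delta", ""))
    return "".join(parts)
```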
- Port already in use: kill old processes with `lsof -ti:8000 | xargs kill -9` (for the backend) or change the port.
- Frontend not loading: make sure you ran `npm install` and `npm run dev` in the `ui/` folder.
- To change the agent's greeting, edit `ui/src/components/ChatUI.jsx`.
- To update backend agent logic, modify `api_server.py`.
- UI styling is in `ui/src/components/ChatUI.css` (Databricks red palette).
- Inspired by Databricks Supply Chain Solution Accelerator
- Uses openai-agents-python
- Databricks MCP integration via databricks.sdk
- Supply-chain scope enforced by a simple LLM guardrail (see `supply_chain_guardrails.py`)
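`supply_chain_guardrails.py` asks an LLM to decide whether a query is in scope. The sketch below substitutes a keyword heuristic purely to show the guardrail's shape; the function name and term list are illustrative and are not the project's actual implementation:

```python
# Illustrative stand-in for the LLM-based scope guardrail.
SUPPLY_CHAIN_TERMS = {
    "supply", "shipment", "inventory", "warehouse", "distribution",
    "logistics", "demand", "forecast", "supplier", "delay", "delays",
}

def in_supply_chain_scope(query: str) -> bool:
    """Cheap keyword stand-in for the LLM guardrail: treat a query
    as in-scope if it mentions any supply-chain term. The real
    guardrail instead asks an LLM to classify the query before the
    agent is allowed to answer it."""
    words = {w.strip(".,?!").lower() for w in query.split()}
    return bool(words & SUPPLY_CHAIN_TERMS)
```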