A production-ready chatbot application built on a Databricks Model Serving endpoint (works with AgentBricks) and LakeBase, featuring a modern Streamlit interface with chat history persistence and real-time streaming responses.
This application demonstrates an end-to-end conversational AI solution leveraging the Databricks ecosystem:
- 🧱 AgentBricks: Powered by Databricks' Agent Framework, supporting multiple agent types including ChatAgent, ResponsesAgent, and standard chat completions
- 🗄️ LakeBase: PostgreSQL-backed chat history persistence using Databricks LakeBase for reliable data storage
- ⚡ Streamlit: Modern, responsive UI with real-time streaming responses
- 🔐 OAuth Integration: Secure authentication via Databricks workspace OAuth tokens
- Multi-Agent Support: Compatible with `chat/completions`, `agent/v2/chat`, and `agent/v1/responses` endpoint types
- Streaming Responses: Real-time token streaming for a responsive user experience
- Tool Calling: Full support for function/tool calling with visual feedback
- Feedback System: Built-in thumbs up/down feedback mechanism for continuous improvement
- Persistent Storage: All conversations saved to LakeBase PostgreSQL database
- Historical View: Browse and view past conversations from the sidebar
- Database Resilience: Graceful fallback if database is unavailable
- Modern UI: Clean, intuitive interface with emoji indicators
- Context Display: View metadata including request IDs, endpoints, and timestamps
- Error Handling: Automatic retry with non-streaming fallback on errors
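The database resilience noted above (graceful fallback when the database is unavailable) can be sketched as a small decorator. This is an illustrative helper, not the app's actual code; `load_recent_conversations` is a hypothetical function simulating an outage.

```python
import functools
import logging

logger = logging.getLogger(__name__)

def with_db_fallback(default=None):
    """Decorator: if the wrapped database call raises, log a warning and
    return `default` instead of crashing the chat UI (hypothetical helper;
    the app's actual resilience logic may differ)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            try:
                return fn(*args, **kwargs)
            except Exception as exc:  # e.g. a psycopg2.OperationalError
                logger.warning("Database unavailable, continuing without history: %s", exc)
                return default
        return wrapper
    return decorator

@with_db_fallback(default=[])
def load_recent_conversations():
    raise ConnectionError("postgres unreachable")  # simulate an outage
```

With this pattern, a database outage degrades the app to an empty history list rather than an error page.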
```
┌─────────────────┐
│  Streamlit UI   │
└────────┬────────┘
         │
         ├──► AgentBricks Serving Endpoint
         │    (ChatAgent/ResponsesAgent/ChatModel)
         │
         └──► LakeBase PostgreSQL
              (Chat History Storage)
```
- Databricks Workspace: Access to a Databricks workspace with Model Serving enabled
- Serving Endpoint: A deployed AgentBricks endpoint with `CAN_QUERY` permissions
- LakeBase: PostgreSQL database credentials configured in environment variables
- Python: Version 3.9 or higher
- Clone the repository

  ```bash
  git clone <repository-url>
  cd e2e-chatbot-app
  ```

- Install uv (if not already installed)

  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Install dependencies

  ```bash
  uv sync
  ```

- Configure environment variables

  Create a `.env` file with the following variables:

  ```bash
  # Serving Endpoint
  SERVING_ENDPOINT=your-endpoint-name

  # LakeBase PostgreSQL Configuration
  PGDATABASE=your_database_name
  PGUSER=your_username
  PGHOST=your_host.cloud.databricks.com
  PGPORT=5432
  PGSSLMODE=require
  PGAPPNAME=chatbot_app
  ```

- Run the application

  ```bash
  uv run streamlit run app.py
  ```
- Clone the repository

  ```bash
  git clone <repository-url>
  cd e2e-chatbot-app
  ```

- Install dependencies

  ```bash
  pip install -r requirements.txt
  ```

- Configure environment variables (same as above)

- Run the application

  ```bash
  streamlit run app.py
  ```
This application is designed for seamless deployment as a Databricks App:
- Configure `app.yaml`

  The included `app.yaml` file defines the deployment configuration:

  ```yaml
  command: ["streamlit", "run", "app.py"]
  env:
    - name: STREAMLIT_BROWSER_GATHER_USAGE_STATS
      value: "false"
    - name: "SERVING_ENDPOINT"
      valueFrom: "serving-endpoint"
  ```

- Deploy using the Databricks CLI

  ```bash
  databricks apps create chatbot-app \
    --source-code-path . \
    --config app.yaml
  ```

- Grant permissions

  Ensure the app has `CAN_QUERY` permissions on your serving endpoint.
```
e2e-chatbot-app/
├── app.py                  # Main Streamlit application
├── messages.py             # Message classes for chat interface
├── model_serving_utils.py  # AgentBricks endpoint utilities
├── requirements.txt        # Python dependencies
├── app.yaml                # Databricks app configuration
├── .env                    # Environment variables (not committed)
└── README.md               # This file
```
The application intelligently detects and adapts to different endpoint types:
- ChatAgent (`agent/v2/chat`): Streaming chat with tool calling support
- ResponsesAgent (`agent/v1/responses`): Event-based streaming with function calls
- Chat Completions (`chat/completions`): Standard OpenAI-compatible chat interface
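The adaptation to different endpoint types can be thought of as a dispatch on the endpoint's task string. The sketch below is illustrative, assuming the task string (e.g. `agent/v2/chat`) is read from the endpoint metadata; the handler names are hypothetical stand-ins for the app's query functions.

```python
# Illustrative dispatch on the serving endpoint's task type. In practice the
# task string would come from the endpoint metadata (e.g. via the Databricks
# SDK), which is not shown here.

def query_chat_agent(messages):        # handles agent/v2/chat
    return "chat-agent"

def query_responses_agent(messages):   # handles agent/v1/responses
    return "responses-agent"

def query_chat_completions(messages):  # handles chat/completions
    return "chat-completions"

HANDLERS = {
    "agent/v2/chat": query_chat_agent,
    "agent/v1/responses": query_responses_agent,
    "chat/completions": query_chat_completions,
}

def dispatch(task: str, messages):
    """Route a request to the handler matching the endpoint's task type."""
    try:
        return HANDLERS[task](messages)
    except KeyError:
        raise ValueError(f"Unsupported endpoint task: {task}")
```

A table-based dispatch like this keeps the per-endpoint logic isolated, so adding a new endpoint type means adding one handler and one table entry.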
Chat interactions are persisted in a PostgreSQL database with the following schema:
```sql
CREATE TABLE {schema}.chat_history (
    id SERIAL PRIMARY KEY,
    user_message TEXT NOT NULL,
    assistant_response TEXT NOT NULL,
    request_id TEXT,
    endpoint_name TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
```

The schema is dynamically created based on `PGAPPNAME` and `PGUSER` to support multi-tenancy.
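One way the schema name might be derived from `PGAPPNAME` and `PGUSER` is sketched below. The exact naming convention is an assumption on my part; the README only states that the schema depends on these two variables.

```python
import re

def schema_name(pg_app_name: str, pg_user: str) -> str:
    """Derive a per-app, per-user schema name from PGAPPNAME and PGUSER.
    The exact convention here (join with '_', sanitize, lowercase) is an
    assumption for illustration."""
    raw = f"{pg_app_name}_{pg_user}"
    # PostgreSQL identifiers: keep only letters, digits, and underscores.
    return re.sub(r"[^a-zA-Z0-9_]", "_", raw).lower()

def create_table_sql(schema: str) -> str:
    """Fill the {schema} placeholder in the CREATE TABLE statement above."""
    return f"""CREATE TABLE IF NOT EXISTS {schema}.chat_history (
    id SERIAL PRIMARY KEY,
    user_message TEXT NOT NULL,
    assistant_response TEXT NOT NULL,
    request_id TEXT,
    endpoint_name TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)"""
```

Sanitizing the identifier matters because `PGUSER` is often an email address, which is not a valid raw PostgreSQL schema name.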
PostgreSQL credentials use OAuth tokens that are automatically refreshed every 15 minutes:
```python
def refresh_oauth_token():
    return workspace_client.config.oauth_token().access_token
```

- Launch the app and enter a question in the chat input
- View the streaming response in real-time
- Provide feedback using thumbs up/down
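The 15-minute OAuth token refresh described above can be sketched generically as a small cache with a time-to-live. This is a self-contained illustration: the `fetch` callable stands in for the Databricks SDK call (`workspace_client.config.oauth_token()`), and the clock is injectable so the behavior is easy to verify.

```python
import time

class TokenCache:
    """Cache an OAuth token and re-fetch it after `ttl` seconds
    (15 minutes in the app). `fetch` stands in for the SDK call;
    it is injected so this sketch is self-contained."""

    def __init__(self, fetch, ttl=15 * 60, clock=time.monotonic):
        self._fetch = fetch
        self._ttl = ttl
        self._clock = clock
        self._token = None
        self._fetched_at = None

    def get(self) -> str:
        now = self._clock()
        if self._token is None or now - self._fetched_at >= self._ttl:
            self._token = self._fetch()   # refresh the expired token
            self._fetched_at = now
        return self._token
```

In the app, the cached token would be supplied as the PostgreSQL password whenever a new connection is opened.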
When the agent needs to call tools, you'll see:
```
🛠️ Calling `get_weather` with:
{
  "location": "San Francisco"
}
```
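The tool-call display above could be produced by a small formatting helper like the one below. This is an illustrative sketch, not the app's actual rendering code; it assumes some endpoints deliver tool arguments as a JSON string and others as a dict.

```python
import json

def format_tool_call(name: str, arguments) -> str:
    """Render the 'Calling <tool> with:' message shown in the chat UI
    (hypothetical helper; the app's actual rendering may differ)."""
    if isinstance(arguments, str):  # some endpoints send arguments as a JSON string
        arguments = json.loads(arguments)
    return f"🛠️ Calling `{name}` with:\n{json.dumps(arguments, indent=2)}"
```

Pretty-printing the arguments gives the user immediate visual feedback about what the agent is doing before the tool result streams back.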
- Click 🔄 Refresh to reload recent conversations
- Click any conversation to view its full contents
- Click 🆕 New Chat to start a fresh conversation
If the database connection fails:
- Verify LakeBase credentials in `.env`
- Check OAuth token refresh settings
- Ensure network connectivity to the PostgreSQL host
If responses fail:
- Verify the `SERVING_ENDPOINT` environment variable
- Check endpoint permissions (`CAN_QUERY`)
- Review endpoint logs in the Databricks workspace
The app automatically falls back to non-streaming mode if streaming fails.
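The automatic fallback to non-streaming mode can be sketched as a thin wrapper: try the streaming call, and on any error retry once with the blocking call. The function names below are illustrative, and `flaky_stream` simulates a stream that dies mid-response.

```python
def query_with_fallback(stream_fn, complete_fn, messages) -> str:
    """Try streaming first; on any error, retry once with the
    non-streaming call (sketch of the fallback described above;
    function names are hypothetical)."""
    try:
        chunks = []
        for chunk in stream_fn(messages):
            chunks.append(chunk)  # in the app, each chunk is rendered live
        return "".join(chunks)
    except Exception:
        # Discard partial output and get the full response in one call.
        return complete_fn(messages)

def flaky_stream(messages):
    yield "Hel"
    raise RuntimeError("stream interrupted")

def non_streaming(messages):
    return "Hello!"
```

A design note: the partial streamed chunks are discarded on failure so the user sees one consistent answer rather than a truncated prefix followed by a restart.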
Contributions are welcome! Please follow these guidelines:
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Submit a pull request
This project is licensed under the MIT License.
- Databricks Agent Framework Documentation
- Databricks Apps Documentation
- LakeBase Documentation
- Streamlit Documentation
For issues and questions:
- Open an issue in this repository
- Contact the Databricks support team
- Check the Databricks community forums
Built with ❤️ using Databricks AgentBricks and LakeBase
