A modern, beautifully-designed AI chatbot built using OpenAI LLMs, LangChain prompt chaining, and LangSmith monitoring—all wrapped inside a polished Streamlit UI with a glassmorphism background.
✨ **Production-Ready LLM Chatbot with End-to-End Monitoring** ✨
- 🌟 Key Features
- 📸 UI Preview
- 🧱 Architecture
- ⚙️ Installation & Setup
- 🔐 API Keys & Environment Variables
- ▶️ Run the Chatbot
- 📂 Project Structure
- 🧪 Future Enhancements
- 🙌 Credits
- 📄 License
This project serves as a robust, full-stack demo showcasing best practices for building an LLM application.
| Feature | Description | Benefits |
|---|---|---|
| LangSmith Tracing | Track prompt chains, latency, and token usage for every request. | Observability & debugging in production. |
| LangChain Integration | Utilizes `ChatPromptTemplate` and `ChatOpenAI` for clean, modular code. | Maintainability and easy chain definition. |
| Multi-Model Support | Switch between `gpt-4o-mini`, `gpt-4o`, and `gpt-3.5-turbo` in real time. | Flexibility and cost optimization. |
| Glassmorphism UI | Custom CSS for a modern, blurred, responsive Streamlit interface. | Polished user experience for demos/portfolio. |
| Chat Management | Full session history and `.txt` transcript download. | User utility and data export. |
| Secure Handling | Uses `.env` for secure API key management. | Security best practice. |
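The multi-model switcher and transcript download from the table above can be sketched roughly as follows; the widget labels, session-state keys, and file name are illustrative assumptions, not the exact code in `Llm_app.py`.

```python
import streamlit as st

# Sidebar model switcher -- the listed models come from the feature table,
# but the widget label and layout are assumptions.
model_name = st.sidebar.selectbox(
    "Model", ["gpt-4o-mini", "gpt-4o", "gpt-3.5-turbo"]
)

# Keep the full chat history in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state["messages"] = []

# Offer the conversation as a plain-text transcript download.
transcript = "\n".join(
    f"{m['role']}: {m['content']}" for m in st.session_state["messages"]
)
st.sidebar.download_button("Download transcript", transcript, file_name="chat_history.txt")
```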
A clean, responsive UI built with custom CSS and the glassmorphism aesthetic.
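The glassmorphism effect is typically achieved by injecting custom CSS into Streamlit; the selectors and values below are a rough illustration of the technique, not the project's actual stylesheet.

```python
import streamlit as st

# Illustrative glassmorphism styling via injected CSS -- the selectors and
# values are assumptions, not the CSS shipped with this app.
st.markdown(
    """
    <style>
    .stApp {
        background: linear-gradient(135deg, #1f2a44, #3c1053);
    }
    .block-container {
        background: rgba(255, 255, 255, 0.15);  /* translucent panel */
        backdrop-filter: blur(12px);            /* frosted-glass blur */
        border-radius: 16px;
    }
    </style>
    """,
    unsafe_allow_html=True,
)
```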
The chatbot utilizes a standard, traceable LLM Chain architecture, monitored end-to-end by LangSmith.
```mermaid
graph TD
    A[Streamlit UI] --> B(User Input / Session State);
    B --> C[LangChain Prompt Template];
    C --> D[ChatOpenAI / LLM Call];
    D --> E[LangSmith Tracing];
    E --> F[Output Parser / Response];
    F --> B;
```
- Streamlit UI handles user input and displays the chat history.
- The input is passed to a LangChain object that manages the ChatPromptTemplate (see the sketch after this list).
- The request is sent to ChatOpenAI (GPT models).
- LangSmith monitors the entire chain, logging the prompt, response, latency, and token consumption.
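For reference, here is a minimal sketch of that chain in LangChain; the prompt text, variable names, and chosen model are assumptions, and LangSmith tracing is switched on purely via the environment variables described in the setup section.

```python
import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# LangSmith picks the run up automatically once tracing is enabled in the env.
os.environ.setdefault("LANGCHAIN_TRACING_V2", "true")

# Prompt -> LLM -> parser, mirroring the diagram above.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),  # assumed system prompt
    ("user", "{question}"),
])
llm = ChatOpenAI(model="gpt-4o-mini")            # any of the supported models
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"question": "What does LangSmith trace?"}))
```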
Follow these steps to get the project running on your local machine.
1️⃣ Clone the Repository
```bash
git clone https://github.com/mubasshirahmed-3712/openai-langchain-chatbot.git
cd openai-langchain-chatbot
```
Create a file named .env in the root directory and add your keys.
| Variable | Description |
|---|---|
| `OPENAI_API_KEY` | Your key for accessing OpenAI's LLMs. |
| `LANGCHAIN_API_KEY` | Your key for connecting to LangSmith. |
| `LANGCHAIN_TRACING_V2` | Set to `true` to enable end-to-end tracing with LangSmith. |
`.env` file example:

```env
OPENAI_API_KEY=sk-xxxx
LANGCHAIN_API_KEY=lsv2_xxxx
LANGCHAIN_TRACING_V2=true
```
The `.env` file is already ignored by `.gitignore` for security.
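A short sketch of how the app can pick these values up at startup with python-dotenv; the variable names match the table above, though the exact loading code in `Llm_app.py` may differ.

```python
import os
from dotenv import load_dotenv

# Load the .env file from the project root into the process environment.
load_dotenv()

openai_key = os.getenv("OPENAI_API_KEY")
tracing_enabled = os.getenv("LANGCHAIN_TRACING_V2", "false").lower() == "true"

if not openai_key:
    raise RuntimeError("OPENAI_API_KEY is missing -- check your .env file.")
```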
Start the Streamlit application from your terminal:
```bash
streamlit run Llm_app.py
```
The application will open in your web browser at:
👉 http://localhost:8501
```
openai-langchain-chatbot/
├── assets/
│   ├── UI_Bg1.jpg          # Background images for UI
│   └── UI_Bg2.jpg
├── raw_scripts/
│   └── app.py              # Raw script backup/reference
├── Llm_app.py              # 🚀 Main Streamlit application file
├── requirements.txt        # Python package dependencies
├── .env                    # Secure API key storage (ignored)
└── UI_Overview.png         # UI Preview image
```
The following features are planned to expand the chatbot's capabilities:
- Tool-Use/Agent: Integrate tools like Web Search and Calculator.
- RAG: Implement Retrieval-Augmented Generation with local PDFs or documentation.
- Conversation Memory: Switch to robust JSON-backed memory for persistence.
- TTS Output: Add Text-to-Speech functionality for verbal responses.
A special thanks to the following technologies and their creators:

- OpenAI
- LangChain
- LangSmith
- Streamlit
This project is licensed under the MIT License.
⭐ Star this repository if you found it useful!
