An AI-powered Library Management Assistant built with FastAPI, LangChain, and Streamlit.
It helps manage books, customers, and orders through natural language queries.
- 🤖 GPT-powered intelligent assistant (LangChain Agent)
- ⚡ FastAPI backend with tool orchestration
- 🖥️ Streamlit interactive frontend
- 🗄️ PostgreSQL database with SQLAlchemy ORM
- 🔄 Auto tool chaining (find_books → restock_book → create_order → ...)
- 💬 Persistent chat sessions with database logging
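The auto tool-chaining bullet can be illustrated with a plain-Python sketch. The tool names come from the feature list above; the in-memory "database" and the dispatch logic are illustrative assumptions, not the project's actual LangChain agent:

```python
# Illustrative-only sketch of chaining find_books -> restock_book -> create_order.
# The real project delegates this orchestration to a LangChain agent; the
# hard-coded data below is a stand-in for the PostgreSQL database.
stock = {"Clean Code": 3}
orders = []

def find_books(author):
    """Look up titles by author (hard-coded sample data)."""
    return ["Clean Code"] if author == "Robert C. Martin" else []

def restock_book(title, qty):
    """Increase stock for a title and return the new level."""
    stock[title] = stock.get(title, 0) + qty
    return stock[title]

def create_order(customer_id, title, qty):
    """Reserve copies for a customer, restocking first if needed."""
    if stock.get(title, 0) < qty:
        restock_book(title, qty)  # chain: restock before ordering
    stock[title] -= qty
    orders.append({"customer": customer_id, "title": title, "qty": qty})
    return orders[-1]

# Chained call, mirroring the arrow in the feature list:
title = find_books("Robert C. Martin")[0]
order = create_order(2, title, 2)
```

In the real agent, the LLM decides which tool to call next based on each tool's output; this sketch only shows the data flow between the steps.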
```
GenAI_Assessment/
├── app/
│   └── chat_ui.py        # Streamlit frontend
├── server/
│   ├── main.py           # FastAPI + Streamlit launcher
│   ├── db.py             # Database connection
│   ├── models/           # SQLAlchemy models
│   ├── agent/
│   │   ├── chains/       # LangChain agent logic
│   │   └── tools/        # Custom AI tools
│   ├── routes/           # FastAPI endpoints
│   └── config.py         # Environment settings
├── db/
│   └── seed.py           # Seed script to populate sample data
└── README.md
```
```bash
git clone <your-repo-url>
cd GenAI_Assessment
conda create -n desk-agent python=3.11 -y
conda activate desk-agent
pip install -r requirements.txt
```

Create a `.env` file in the project root:
```env
APP_NAME=""
APP_VERSION=""
OPENAI_API_KEY=""
POSTGRES_DB_URL=""
POSTGRES_DB_NAME=""
POSTGRES_DB_USER=""
POSTGRES_DB_PASSWORD=""
API_URL=""
```

Environment Variables:

- `APP_NAME`: Application name displayed in the UI
- `APP_VERSION`: Current version of the application
- `OPENAI_API_KEY`: Your OpenAI API key for GPT integration
- `POSTGRES_DB_URL`: Full PostgreSQL connection string
- `POSTGRES_DB_NAME`: Database name (e.g., `library_db`)
- `POSTGRES_DB_USER`: Database username
- `POSTGRES_DB_PASSWORD`: Database password
- `API_URL`: Backend API endpoint for chat requests
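As a rough sketch of how these values could be read into the application (the function name and empty-string defaults are assumptions for illustration; the project's actual `server/config.py` may differ):

```python
# Hypothetical sketch of reading the .env values listed above.
# Assumes the variables have already been exported into the process
# environment (e.g. via python-dotenv's load_dotenv()).
import os

def get_settings() -> dict:
    """Collect the documented settings, defaulting to empty strings."""
    keys = [
        "APP_NAME", "APP_VERSION", "OPENAI_API_KEY",
        "POSTGRES_DB_URL", "POSTGRES_DB_NAME",
        "POSTGRES_DB_USER", "POSTGRES_DB_PASSWORD", "API_URL",
    ]
    return {key: os.getenv(key, "") for key in keys}
```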
When you run the application for the first time, all database tables are created automatically through the SQLAlchemy ORM.
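The automatic table creation can be sketched with SQLAlchemy's `Base.metadata.create_all` (the model and the in-memory SQLite engine below are stand-ins for illustration; the project uses its own models and the PostgreSQL URL from `.env`):

```python
# Hedged sketch of SQLAlchemy creating tables on first run.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Book(Base):
    """Minimal stand-in model; the real project defines its own models."""
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)

# In-memory SQLite stands in for the POSTGRES_DB_URL connection string.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)  # creates all tables registered on Base
```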
To populate initial books, customers, and sample data:
```bash
cd ~/path/to/your/project
python -m db.seed
```

If you encounter any database issues, you can manually create the schema using the provided SQL file:

```bash
psql -U your_username -d library_db -f seed.sql
```

This will create all necessary tables and constraints for the application.
You can run FastAPI and Streamlit together from one terminal:
```bash
python -m server.main
```

Once launched:

- FastAPI runs on → http://127.0.0.1:5000
- Streamlit UI runs on → http://127.0.0.1:8501
You can test directly in the Streamlit chat or via API calls.
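For API calls from Python, a request to the `/chat` endpoint can be built with the standard library alone (the URL is assumed from the FastAPI port above; the payload shape matches the example queries below):

```python
# Hypothetical helper for exercising the /chat endpoint from Python.
import json
import urllib.request

API_URL = "http://127.0.0.1:5000/chat"  # assumed from the port listed above

def build_chat_request(query: str) -> urllib.request.Request:
    """Build a POST request carrying a {"query": ...} JSON payload."""
    body = json.dumps({"query": query}).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires the server to be running):
# with urllib.request.urlopen(build_chat_request("Find books by Robert C. Martin")) as resp:
#     print(resp.read().decode())
```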
Example queries:

```json
{ "query": "Find books by Robert C. Martin" }
{ "query": "Restock Clean Code book by 45" }
{ "query": "Create an order for customer 2 for 2 copies of Clean Code" }
{ "query": "Update the price of Clean Code to 50 dollars" }
{ "query": "What is the status of order 2?" }
{ "query": "Show me all books that are running low on stock" }
```

| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Health check |
| `/chat` | POST | Send a query to the AI agent |
| `/sessions` | GET | List all chat sessions |
| `/messages/{session_id}` | GET | Retrieve chat history for a specific session |
Mohannad Hendi
Senior AI Engineer • Automation Specialist • Data Scientist