# MicroForum

A microforum web application with role-based access, sentiment analysis, and an admin dashboard for analytics. Built with FastAPI, React, RabbitMQ, and Docker.
- Overview
- Features
- Tech Stack
- Architecture
- Database Schema
- API Endpoints
- Sentiment Analysis
- Setup & Installation
- Usage
- Docker
- Contributing
- License
## Overview

MicroForum is a modern forum application supporting user authentication, post and comment creation, sentiment analysis, and an admin dashboard for analytics. It is designed for easy deployment and scalability using Docker.
## Features

- JWT-based authentication (signup/login)
- Role-based access (normal user/admin)
- Create, view, and delete posts and comments
- Sentiment analysis for posts and comments
- Admin dashboard with sentiment analytics (pie, line, bar charts)
- Asynchronous processing with RabbitMQ
- Containerized with Docker
## Tech Stack

- Backend: FastAPI, SQLAlchemy, Pydantic, RabbitMQ
- Frontend: React.js, Tailwind CSS
- Database: SQLite
- AI/NLP: HuggingFace Transformers (Twitter-RoBERTa sentiment model)
- Containerization: Docker, Docker Compose
## Architecture

- API Layer: Handles HTTP requests and responses (`app/api/`)
- Service Layer: Business logic and async jobs (`app/services/`)
- Repository Layer: Database operations (`app/repositories/`)
- Models: SQLAlchemy ORM models (`app/models/`)
- Schemas: Pydantic data validation (`app/schemas/`)
- Message Queue: RabbitMQ for async sentiment analysis
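The layering above can be sketched in plain Python. This is a hedged illustration, not the project's actual code: dataclasses stand in for the SQLAlchemy model and Pydantic schema, and the class and method names (`InMemoryPostRepository`, `PostService`, `create_post`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Models/Schemas layer: a simplified stand-in for the ORM model and
# Pydantic schema (names here are illustrative, not the project's).
@dataclass
class Post:
    id: int
    title: str
    body: str
    sentiment: Optional[str] = None  # filled in later by the async worker

# Repository layer: database operations behind a narrow interface.
class InMemoryPostRepository:
    def __init__(self) -> None:
        self._posts: dict = {}
        self._next_id = 1

    def add(self, title: str, body: str) -> Post:
        post = Post(self._next_id, title, body)
        self._posts[post.id] = post
        self._next_id += 1
        return post

    def get(self, post_id: int) -> Optional[Post]:
        return self._posts.get(post_id)

# Service layer: business logic; hands heavy work to the queue
# instead of doing it inline on the request path.
class PostService:
    def __init__(self, repo: InMemoryPostRepository,
                 enqueue: Callable[[int], None]) -> None:
        self._repo = repo
        self._enqueue = enqueue  # would publish to RabbitMQ in the real app

    def create_post(self, title: str, body: str) -> Post:
        post = self._repo.add(title, body)
        self._enqueue(post.id)  # sentiment analysis happens asynchronously
        return post
```

Note how the service returns immediately with `sentiment=None`; the worker fills that field in later, which is what keeps the API responsive.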
## Sentiment Analysis

This project uses a transformer-based sentiment analysis model from HuggingFace:

- Model: `cardiffnlp/twitter-roberta-base-sentiment`
- Type: RoBERTa, pre-trained and fine-tuned for sentiment analysis on social media and short text (Twitter).
- Why this model?
  - Outperformed the other tested models (BERTweet, DistilBERT SST-2) in accuracy and speed on forum-style and social text.
  - Provides three sentiment classes: Positive, Neutral, Negative.
  - Easy integration with Python using the `transformers` library.
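As a minimal sketch, scoring a piece of text with this model through the `transformers` pipeline might look like the following. The model emits raw labels (`LABEL_0`/`LABEL_1`/`LABEL_2`); the mapping to negative/neutral/positive below follows the model card and should be verified against it.

```python
# Map the model's raw labels to readable sentiment classes.
# cardiffnlp/twitter-roberta-base-sentiment emits LABEL_0/1/2
# (negative/neutral/positive per its model card -- verify before relying on it).
LABELS = {"LABEL_0": "negative", "LABEL_1": "neutral", "LABEL_2": "positive"}

def readable(label: str) -> str:
    """Translate a raw pipeline label into a sentiment class name."""
    return LABELS.get(label, label)

if __name__ == "__main__":
    # Requires `pip install transformers torch`; downloads the model on first run.
    from transformers import pipeline

    classify = pipeline(
        "sentiment-analysis",
        model="cardiffnlp/twitter-roberta-base-sentiment",
    )
    result = classify("I love this forum!")[0]
    print(readable(result["label"]), round(result["score"], 3))
```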
- Message Queue: RabbitMQ is used to handle asynchronous tasks, specifically sentiment analysis.
- How it works:
  - When a post or comment is created, a message is published to a queue.
  - A background worker consumes messages from the queue and performs sentiment analysis using the AI model.
  - The sentiment result is then stored back in the database.
- Benefits:
  - Decouples the user-facing API from heavy AI processing.
  - Improves responsiveness and scalability.
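The publish side of this pipeline might look like the sketch below, using the `pika` client's blocking API. The queue name (`sentiment_jobs`) and message shape are assumptions for illustration, not the project's actual wire format.

```python
import json

QUEUE = "sentiment_jobs"  # hypothetical queue name

def build_message(kind: str, item_id: int, text: str) -> bytes:
    """Serialize a post or comment into a sentiment-job message."""
    return json.dumps({"kind": kind, "id": item_id, "text": text}).encode("utf-8")

def publish(message: bytes, host: str = "localhost") -> None:
    """Publish one sentiment job; requires a running RabbitMQ broker."""
    import pika  # pip install pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE, durable=True)  # survive broker restarts
    channel.basic_publish(exchange="", routing_key=QUEUE, body=message)
    connection.close()
```

Declaring the queue as durable (and publishing to the default exchange with the queue name as routing key) is the standard RabbitMQ work-queue pattern.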
1. Client sends a request (e.g., create post/comment) to the API Layer (FastAPI).
2. The API Layer validates the request and forwards it to the Service Layer.
3. The Service Layer handles business logic and, if needed, publishes a message to RabbitMQ for async processing.
4. The Repository Layer manages database operations (CRUD).
5. A worker listens to the message queue, processes sentiment analysis, and updates the database.
6. The client can fetch results (posts/comments with sentiment) via the API endpoints.
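The worker side of the steps above can be sketched as a small handler that is testable without a broker, plus a `pika` consume loop around it. The queue name, message shape, and the `analyze`/`save` callables are hypothetical stand-ins for the real model call and repository update.

```python
import json
from typing import Callable

def handle_job(body: bytes,
               analyze: Callable[[str], str],
               save: Callable[[str, int, str], None]) -> None:
    """Process one queue message: run sentiment analysis and persist the result."""
    job = json.loads(body)
    sentiment = analyze(job["text"])         # e.g. the transformers pipeline
    save(job["kind"], job["id"], sentiment)  # e.g. an UPDATE via the repository

if __name__ == "__main__":
    # Requires a running RabbitMQ broker and `pip install pika`.
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="sentiment_jobs", durable=True)

    def on_message(ch, method, properties, body):
        handle_job(body, analyze=lambda text: "neutral", save=print)  # stubs
        ch.basic_ack(delivery_tag=method.delivery_tag)  # ack only after success

    channel.basic_consume(queue="sentiment_jobs", on_message_callback=on_message)
    channel.start_consuming()
```

Acknowledging only after the handler succeeds means an unprocessed message is redelivered if the worker crashes mid-job.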
Three transformer models were tested:
- BERTweet (`finiteautomata/bertweet-base-sentiment-analysis`)
- Twitter-RoBERTa (`cardiffnlp/twitter-roberta-base-sentiment`) ← selected for best accuracy and speed
- DistilBERT SST-2 (`distilbert-base-uncased-finetuned-sst-2-english`)
Twitter-RoBERTa was chosen for its superior performance on social/forum text.
## Setup & Installation

1. Clone the repository:

   ```bash
   git clone <repo-url>
   cd microforum-dev
   ```

2. Backend setup:

   ```bash
   cd MicroForum/backend
   pip install -r requirements.txt
   # Set up environment variables as needed
   ```

3. Frontend setup:

   ```bash
   cd ../frontend
   npm install
   ```
## Usage

- Access the frontend at `http://localhost:3000`
- API docs available at `http://localhost:8000/docs`
- Admin dashboard available after logging in as an admin
## Contributing

- Fork the repository
- Create a new branch (`git checkout -b feature/your-feature`)
- Commit your changes (`git commit -am 'Add new feature'`)
- Push to the branch (`git push origin feature/your-feature`)
- Open a Pull Request
## License

MIT License