
📝 Subjective Answer Evaluation System

This web-based application evaluates scanned subjective answer sheets by comparing student responses with model answers using Optical Character Recognition (OCR) and Natural Language Processing (NLP) techniques. It supports PDF file uploads for both student and model answer sheets.



🚀 Features

  • Upload PDF Answer Sheets: Upload scanned PDF files containing student answers and model answers.
  • OCR for Text Extraction: Extracts text from the page images in each PDF using Tesseract OCR.
  • Text Comparison: Compares the extracted student text with the model answers to evaluate correctness.
  • Accuracy Score: Calculates an accuracy score from the similarity between the student's answer and the model answer.
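
The pipeline described above can be sketched roughly as follows. This is an illustrative sketch, not the repository's actual code: `pdf_to_text` assumes the `pytesseract` and `pdf2image` packages (plus a Poppler install) and is imported lazily so the scoring helper runs without them, and the scoring shown here uses Python's standard-library `difflib` — the project's NLP-based comparison may use a different method.

```python
from difflib import SequenceMatcher


def pdf_to_text(pdf_path: str) -> str:
    """OCR every page of a scanned PDF into one text string.

    Assumes pytesseract and pdf2image are installed; they are
    imported lazily so accuracy_score() works without them.
    """
    import pytesseract
    from pdf2image import convert_from_path

    pages = convert_from_path(pdf_path)  # one PIL image per PDF page
    return "\n".join(pytesseract.image_to_string(page) for page in pages)


def accuracy_score(student_text: str, model_text: str) -> float:
    """Percentage similarity between a student answer and the model answer."""
    ratio = SequenceMatcher(None, student_text.lower(), model_text.lower()).ratio()
    return round(ratio * 100, 2)
```

An identical answer scores 100.0, a completely unrelated one scores near 0, and partial matches fall in between.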

🧰 Tech Stack

Python · Streamlit · Tesseract OCR · NLP · PDF processing


🛠️ Getting Started

To run this project locally, follow these instructions:

✅ Prerequisites

  • Python 3.x
  • Tesseract OCR installed and configured in PATH
  • Required Python libraries (listed in requirements.txt)
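
Before installing the Python dependencies, you can confirm Tesseract is actually discoverable on your PATH. A minimal stdlib-only check (the executable name `tesseract` is the default on all platforms):

```python
import shutil


def tesseract_available() -> bool:
    """Return True if the `tesseract` executable is found on PATH."""
    return shutil.which("tesseract") is not None


if __name__ == "__main__":
    if not tesseract_available():
        print("Tesseract not found - install it and add it to your PATH.")
```

If this prints the warning, OCR extraction will fail at runtime, so fix the PATH first.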

📦 Installing Dependencies

  1. Clone the repository:

    git clone https://github.com/Praveen-koujalagi/Subjective-Answer-Evaluation-System.git
    cd Subjective-Answer-Evaluation-System
  2. Create a virtual environment:

    python -m venv env
  3. Activate the virtual environment:

    • On Windows:
      .\env\Scripts\activate
    • On macOS/Linux:
      source env/bin/activate
  4. Install the dependencies:

    pip install -r requirements.txt

▶️ Running the Application

To run the app locally:

streamlit run app.py

👥 Team

  • Praveen Koujalagi
  • S Sarvesh Balaji
  • Sujit G
