8BitButter/CropDoc_MinorProject

Setup and Installation

  1. Clone the Repository:

    git clone https://github.com/8BitButter/CropDoc_MinorProject
    cd CropDoc_MinorProject
  2. Create a Virtual Environment (Recommended):

    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  3. Install Dependencies: Ensure you have PyTorch installed (visit pytorch.org for instructions specific to your system and CUDA version if using a GPU), then install the remaining dependencies:

    pip install -r requirements.txt

    Note: The TinyLlama model is relatively small but may still require significant RAM, especially without a GPU; running the LLM on CPU will be slow.

  4. Download/Place Model Files:

    • You need to obtain the .pth model files listed in the Project Structure section.
    • Create a directory named models in the project root.
    • Place all the .pth files inside the models/ directory.
  5. (Optional) GPU Acceleration: For significantly faster LLM inference, ensure you have a compatible NVIDIA GPU, CUDA toolkit, and cuDNN installed, and that your PyTorch installation includes GPU support. The device_map="auto" setting in llmres.py will attempt to use the GPU if available.
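Before launching the app, the model files from step 4 can be sanity-checked with a short script. The file names below are hypothetical stand-ins; substitute the actual .pth files listed in the Project Structure section:

```python
from pathlib import Path

# Hypothetical file names; substitute the actual .pth files listed in
# the Project Structure section.
EXPECTED_WEIGHTS = ["GeneralModel.pth", "Cashew.pth", "Cassava.pth",
                    "Maize.pth", "Tomato.pth"]

def missing_weights(models_dir: str = "models") -> list[str]:
    """Return the expected .pth files that are absent from models_dir."""
    root = Path(models_dir)
    return [name for name in EXPECTED_WEIGHTS if not (root / name).is_file()]

if __name__ == "__main__":
    missing = missing_weights()
    if missing:
        print("Missing model files:", ", ".join(missing))
    else:
        print("All model files present.")
```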

Usage

  1. Navigate to the project's root directory in your terminal (where app.py is located).
  2. Ensure your virtual environment is activated.
  3. Run the Streamlit application:
    streamlit run app.py
  4. The application will open in your web browser.
  5. Upload a crop image using the file uploader.
  6. Select the appropriate analysis model (General, Cashew, Cassava, Maize, or Tomato) from the dropdown.
  7. Click the "Analyze Image" button.
  8. The application will display the uploaded image, the predicted disease class, the confidence score, and detailed advice generated by the LLM.

Models

  • Vision Models: The application uses several instances of EfficientNet-B4, each fine-tuned for a specific task:
    • GeneralModel: Detects diseases across all supported crops.
    • Cashew, Cassava, Maize, Tomato: Specialized models for detecting diseases specific to that crop.
    • Class names for each model are defined within the ModelManager.MODEL_CONFIG dictionary in app.py.
  • Language Model:
    • TinyLlama/TinyLlama-1.1B-Chat-v1.0: A compact LLM used for generating contextual agricultural advice. It's loaded locally via the transformers library.

Contributing

Contributions are welcome! Please feel free to submit pull requests or open issues for bugs, feature requests, or improvements.
