- **Clone the Repository:**

  ```bash
  git clone https://github.com/8BitButter/CropDoc_MinorProject CropDoc
  cd CropDoc
  ```
- **Create a Virtual Environment (Recommended):**

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  ```
- **Install Dependencies:** Ensure you have PyTorch installed (visit pytorch.org for instructions specific to your system/CUDA version if using a GPU). Then install the rest:

  ```bash
  pip install -r requirements.txt
  ```

  Note: The TinyLlama model is relatively small but might still require significant RAM, especially without a GPU. Running the LLM part might be slow on CPU.
- **Download/Place Model Files:**
  - You need to obtain the `.pth` model files listed in the Project Structure section.
  - Create a directory named `models` in the project root.
  - Place all the `.pth` files inside the `models/` directory.
- **(Optional) GPU Acceleration:** For significantly faster LLM inference, ensure you have a compatible NVIDIA GPU, CUDA toolkit, and cuDNN installed, and that your PyTorch installation includes GPU support. The `device_map="auto"` setting in `llmres.py` will attempt to use the GPU if available (see the environment check sketched after this list).
- Navigate to the project's root directory in your terminal (where `app.py` is located).
- Ensure your virtual environment is activated.
- Run the Streamlit application:

  ```bash
  streamlit run app.py
  ```
- The application will open in your web browser.
- Upload a crop image using the file uploader.
- Select the appropriate analysis model (General, Cashew, Cassava, Maize, or Tomato) from the dropdown.
- Click the "Analyze Image" button.
- The application will display the uploaded image, the predicted disease class, the confidence score, and detailed advice generated by the LLM (a rough sketch of the underlying classification step appears after this list).
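For readers curious about what happens when you click "Analyze Image", the sketch below shows a generic EfficientNet-B4 classification flow that produces a predicted class and confidence score. It is only illustrative: the checkpoint file name, class labels, and preprocessing are placeholder assumptions, and the project's actual logic lives in `app.py` (see `ModelManager.MODEL_CONFIG`).

```python
# Illustrative sketch of the classification step behind "Analyze Image".
# Checkpoint path, class labels, and preprocessing below are placeholders;
# the real configuration lives in ModelManager.MODEL_CONFIG in app.py.
import torch
from PIL import Image
from torchvision import models, transforms

class_names = ["healthy", "leaf_spot", "rust"]   # placeholder labels
checkpoint_path = "models/GeneralModel.pth"      # placeholder file name

# EfficientNet-B4 with a classification head resized to the label set.
model = models.efficientnet_b4(weights=None)
model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features, len(class_names))
model.load_state_dict(torch.load(checkpoint_path, map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((380, 380)),               # EfficientNet-B4's native input size
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

image = preprocess(Image.open("leaf.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    probs = torch.softmax(model(image), dim=1)[0]
confidence, idx = probs.max(dim=0)
print(f"Predicted: {class_names[idx.item()]} ({confidence.item():.1%} confidence)")
```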
- **Vision Models:** The application uses several instances of EfficientNet-B4, each fine-tuned for a specific task:
  - `GeneralModel`: Detects diseases across all supported crops.
  - `Cashew`, `Cassava`, `Maize`, `Tomato`: Specialized models for detecting diseases specific to that crop.
  - Class names for each model are defined within the `ModelManager.MODEL_CONFIG` dictionary in `app.py`.
- **Language Model:**
  - `TinyLlama/TinyLlama-1.1B-Chat-v1.0`: A compact LLM used for generating contextual agricultural advice. It is loaded locally via the `transformers` library (a loading sketch follows this list).
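As a rough sketch, the TinyLlama model can be loaded and queried with `transformers` and `device_map="auto"` roughly as shown below. The project's actual loading and prompting code is in `llmres.py`; the prompt and generation settings here are placeholders.

```python
# Illustrative sketch of loading the TinyLlama advice model with transformers.
# The project's real code is in llmres.py; the prompt and generation settings
# here are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # places the model on the GPU if one is available, else CPU
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
)

# TinyLlama-Chat ships a chat template, so the prompt can be built from messages.
messages = [
    {"role": "user", "content": "Give brief treatment advice for tomato leaf blight."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that `device_map="auto"` requires the `accelerate` package to be installed alongside `transformers`.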
Contributions are welcome! Please feel free to submit pull requests or open issues for bugs, feature requests, or improvements.