# DeepForest Multi-Agent System

The DeepForest Multi-Agent System analyzes ecological images by orchestrating multiple cooperating AI agents. Upload an image of a forest, wildlife habitat, or other ecological scene, and ask questions about it in natural language.

## Installation

### 1. Clone the repository

```bash
git clone https://github.com/weecology/deepforest-agent.git
cd deepforest-agent
```

### 2. Create and activate a Conda environment

```bash
conda create -n deepforest_agent python=3.12.11
conda activate deepforest_agent
```

### 3. Install dependencies

```bash
pip install -r requirements.txt
pip install -e .
```

### 4. Configure the HuggingFace Token

Create a `.env` file in the root directory of the deepforest-agent project and add your HuggingFace token:

```bash
HF_TOKEN="your_huggingface_token_here"
```

You can obtain a token from your [HuggingFace access token settings](https://huggingface.co/settings/tokens). Make sure the token type is "Write".

## Usage

The DeepForest Agent runs through a Gradio web interface. To start it, run:

```bash
python -m deepforest_agent.app
```

A local URL such as http://127.0.0.1:7860 will appear in the terminal; open it in your browser to interact with the agent. A public Gradio share link may also be printed if one is available.


## Features

- **Multi-Species Detection**: Detects trees, birds, and livestock using specialized DeepForest models
- **Tree Health Assessment**: Classifies detected trees as alive or dead on request, using the DeepForest tree detector
- **Visual Analysis**: Analyzes both the original and the annotated image with the Qwen2.5-VL-3B-Instruct model
- **Memory Context**: Maintains conversation history for contextual understanding across multiple queries
- **Ecological Insights**: Synthesizes detection data with visual analysis into a comprehensive ecological answer
- **Streaming Responses**: Streams real-time updates as each agent processes your query
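Object detectors like DeepForest typically return bounding boxes with a class label and a confidence score, which the system then summarizes into per-class counts (e.g. alive vs. dead trees). A sketch of that summarization step; the dictionary field names and threshold are illustrative assumptions, not the project's actual schema:

```python
from collections import Counter

def summarize_detections(boxes: list[dict], min_score: float = 0.3) -> dict[str, int]:
    """Count detections per label, keeping only sufficiently confident boxes."""
    kept = [b["label"] for b in boxes if b.get("score", 0.0) >= min_score]
    return dict(Counter(kept))

# Hypothetical detector output for a tree-health query
boxes = [
    {"label": "Alive", "score": 0.91},
    {"label": "Alive", "score": 0.72},
    {"label": "Dead", "score": 0.55},
    {"label": "Alive", "score": 0.12},  # dropped by the confidence threshold
]
```

With the sample boxes above, `summarize_detections(boxes)` yields `{"Alive": 2, "Dead": 1}`.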


## Requirements

### Hardware Requirements

- **GPU**: at least 24 GB of VRAM (recommended for best performance)
- **Storage**: at least 16 GB of free space for model downloads

### API Requirements

- **HuggingFace Token**: required for model access


## Models Used

- **SmolLM3-3B**: Powers the Memory Agent (context retrieval) and the Detector Agent (tool calls with appropriate parameters)
- **Qwen2.5-VL-3B-Instruct**: Powers the Visual Agent's multimodal image-text understanding
- **Llama-3.2-3B-Instruct**: Powers the Ecology Agent's text understanding and generation
- **DeepForest Models**: Detect trees, birds, and livestock, and classify trees as alive or dead


## Multi-Agent Workflow

[Multi-Agent Workflow diagram (Mermaid)](https://mermaid.live/edit#pako:eNplVV1z2jAQ_CsaPfQJMg4fofDQjmNDSgKBxIE2EXlQbRU0sSWPJDelJP-9Z1kmQGHGY-Pdvbu9O7HDsUwYHuC1ovkGPYQrgeDjk3FG1wwt8lTS5Bk1m1_QJYmY1lwKFChGDdw8V-BL-zpwlCVNeWJfo08oMlLBjw4YWGBIpiyTaov8NRMG-YKmW801aqJoOpugybTt4NU1tKThzpG-UQ0U_crU1_cKMCwBb49Mv6ERGTETb1BA4w1L0HU0u3VaI6tyRYaxTOW6jt1Ek5RmFLXPWqgNZaBXbhzb8a4s7xuJDNScoRGHbNE907kUmh3lWaVxK9_QmIyKNHUh5jxnKRc1dmz1rkmZGppSAeYoNBbccLDtb426tqibSqey9a4AgNnu7XLIG4ucEEhJpoV1Pdiw-MW9ntjXU9eaBw6ZrKHMOS1diiAgOvc8D_2SCi25hhCn-tV1amVuyQlm7-LdKxNoOSmVlYxhStAQPCzj1RXdWoUZ8ZPE_qyRkbY_UHysWAY6NE23R1FnljPfHRvwCR3UOrupp2D-MQV3JGQsH0nFtEEhMyyGKdznejJk833b7kn0wnP0H_dj0KvrnU0rIldMMEUN2CplCkOTprqan3tGtRTgtKNFlvBAhn9YXAD-IMJRKzqe5xgPlrEgdhI_0igrL1JT92ZhUUsScm0U_1lJO6j11xm9oQa9wtrkSiZFDHuRFKqcg7qbpTFOcmklv9s2_Re2btmRG_eW8YNM5BpZ_6rq6xx_VILVw3f78GixgcxyxTZMaP6bHco-WtDTx6aeDtvhyh6l8mSZvk-irTAbpktPfVigkBrqgL5fHWWX9ULXqwxNfIEZ_rmtbuqTzR1tgTXkePlP3Ajc4Ra6ppX1peDgISZ0Z9mQLPKknJzF2OKkgMr0M27AKcwTPPhFU80aOGMqo-Uz3pX8FYaiMrbCA7gVrDCKpiu8Eu_Ay6l4kjLDA6MKYCpZrDf1Q2FjhZzCEZ_txRUTCVOBLITBg-75ec-K4MEO_8GDTqt_5nUuWv32xXn_on_RaeAtHjTbvdZZv-X14Nvq9nodr_vewH9t3NZZt9Ntd7q9_mev3-94rQZmCYe9m1Z_L_Zf5v0fgSP6Cg)
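The workflow diagram above can be approximated as a sequential pipeline in which each agent enriches a shared context. This is an illustrative stdlib-only sketch, not the project's actual implementation; the agent names follow the sections above, and all class, function, and field names here are hypothetical, with placeholder values standing in for real model calls:

```python
from dataclasses import dataclass, field

@dataclass
class AgentContext:
    """Shared state handed from agent to agent (hypothetical structure)."""
    query: str
    history: list[str] = field(default_factory=list)
    detections: dict[str, int] = field(default_factory=dict)
    visual_notes: str = ""
    answer: str = ""

def memory_agent(ctx: AgentContext) -> AgentContext:
    # Fold the current query into the conversation history for context.
    ctx.history.append(ctx.query)
    return ctx

def detector_agent(ctx: AgentContext) -> AgentContext:
    # In the real system, this step would run DeepForest models on the image.
    ctx.detections = {"tree": 12, "bird": 3}  # placeholder counts
    return ctx

def visual_agent(ctx: AgentContext) -> AgentContext:
    # In the real system, this step would query the vision-language model.
    ctx.visual_notes = "dense canopy, mixed species"
    return ctx

def ecology_agent(ctx: AgentContext) -> AgentContext:
    # Synthesize detection counts and visual notes into a final answer.
    counts = ", ".join(f"{n} {k}(s)" for k, n in ctx.detections.items())
    ctx.answer = f"Detected {counts}; scene notes: {ctx.visual_notes}."
    return ctx

def run_pipeline(query: str) -> AgentContext:
    ctx = AgentContext(query=query)
    for agent in (memory_agent, detector_agent, visual_agent, ecology_agent):
        ctx = agent(ctx)
    return ctx
```

In the actual system the agents stream intermediate updates to the Gradio interface rather than returning only a final context, but the handoff order is the same as sketched here.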