Commit 3f5146d (parent: dbd566e)

Adding DeepForest multi agent with HuggingFace models implementation

28 files changed: +7177 −1 lines

.gitignore — 13 additions, 0 deletions

@@ -16,3 +16,16 @@ eggs/
 .installed.cfg
 *.egg
 MANIFEST
+
+
+# Environment
+.env
+
+
+# Testing
+.pytest_cache/
+
+
+# Project specific
+lightning_logs/
+.gradio/

README.md — 78 additions, 1 deletion

@@ -1 +1,78 @@
-# deepforest-agent

# DeepForest Multi-Agent System

The DeepForest Multi-Agent System provides ecological image analysis by orchestrating multiple AI agents that work together to interpret ecological images. Simply upload an image of a forest, wildlife habitat, or other ecological scene, then ask questions about it in natural language.

## Installation

### 1. Clone the repository

```bash
git clone https://github.com/weecology/deepforest-agent.git
cd deepforest-agent
```

### 2. Create and activate a Conda environment

```bash
conda create -n deepforest_agent python=3.12.11
conda activate deepforest_agent
```

### 3. Install dependencies

```bash
pip install -r requirements.txt
pip install -e .
```

### 4. Configure the HuggingFace Token

Create a `.env` file in the root directory of the deepforest-agent project and add your HuggingFace token:

```bash
HF_TOKEN="your_huggingface_token_here"
```

You can obtain a token from the [HuggingFace access tokens page](https://huggingface.co/settings/tokens). Make sure the token type is "Write".
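
For illustration, the sketch below shows what the `.env` convention amounts to: `KEY="value"` lines loaded into the process environment. The project itself uses the python-dotenv package for this (it appears in the dependency list); this minimal parser is not the library's implementation.

```python
# Illustrative-only sketch of dotenv loading; the real project uses python-dotenv.
import os

def load_env(path=".env"):
    """Read KEY="value" lines from a dotenv-style file into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')

# Once loaded, downstream code can read the token:
# token = os.environ["HF_TOKEN"]
```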

## Usage

The DeepForest Agent runs through a Gradio web interface. To start the interface, run:

```bash
python -m deepforest_agent.app
```

A local URL such as http://127.0.0.1:7860 will appear in the terminal; open it in your browser to interact with the agent. A public Gradio link may also be provided if available.
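
A hypothetical sketch of the kind of entrypoint `python -m deepforest_agent.app` implies. The real module's options are not shown in this commit, so the flags here (`--host`, `--port`, `--share`) are assumptions mirroring Gradio's defaults:

```python
# Hypothetical entrypoint sketch; flag names and defaults are assumptions.
import argparse

def build_parser():
    p = argparse.ArgumentParser(prog="deepforest_agent.app")
    p.add_argument("--host", default="127.0.0.1")
    p.add_argument("--port", type=int, default=7860)  # Gradio's default port
    p.add_argument("--share", action="store_true",
                   help="also request a public gradio.live link")
    return p

if __name__ == "__main__":
    args = build_parser().parse_args()
    # The real app would call something like:
    # demo.launch(server_name=args.host, server_port=args.port, share=args.share)
    print(f"http://{args.host}:{args.port}")
```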

## Features

- **Multi-Species Detection**: Automatically detects trees, birds, and livestock using specialized DeepForest models
- **Tree Health Assessment**: Identifies alive and dead trees with the DeepForest tree detector when the user asks for it
- **Visual Analysis**: Dual analysis of the original and annotated images using the Qwen2.5-VL-3B-Instruct model
- **Memory Context**: Maintains conversation history for contextual understanding across multiple queries
- **Ecological Insights**: Synthesizes detection data with visual analysis for comprehensive ecological understanding
- **Streaming Responses**: Real-time updates as each agent processes your query
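
Taken together, these features describe a memory → detector → visual → ecology pipeline with streamed status updates. A minimal sketch of that control flow, with stub functions standing in for the model-backed agents (this is illustrative, not the project's implementation):

```python
# Control-flow sketch only: each agent below is a stub for the
# SmolLM3/Qwen/Llama-backed agents the feature list describes.
from typing import Iterator

def memory_agent(query: str, history: list) -> str:
    return f"context from {len(history)} prior turns"

def detector_agent(query: str) -> dict:
    # The real agent calls DeepForest tree/bird/livestock detectors here.
    return {"trees": 12, "birds": 3}

def visual_agent(image: str, detections: dict) -> str:
    n = sum(detections.values())
    return f"visual description of {image} with {n} annotated boxes"

def ecology_agent(context: str, detections: dict, visual: str) -> str:
    return f"ecological synthesis of: {context}; {detections}; {visual}"

def run_query(query: str, image: str, history: list) -> Iterator[str]:
    """Yield a status line per agent, then the final synthesized answer."""
    context = memory_agent(query, history)
    yield "memory agent: done"
    detections = detector_agent(query)
    yield "detector agent: done"
    visual = visual_agent(image, detections)
    yield "visual agent: done"
    yield ecology_agent(context, detections, visual)
```

In the real system, each intermediate `yield` would surface in the Gradio UI as a streaming update while the next agent runs.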

## Requirements

### Hardware Requirements

- **GPU**: a GPU with at least 24 GB VRAM is recommended for optimal performance
- **Storage**: at least 16 GB of free space for model downloads

### API Requirements

- **HuggingFace Token**: required for model access
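
A back-of-envelope estimate (assumptions, not measured numbers) shows why 24 GB is a comfortable budget: three roughly 3B-parameter models at fp16 need about 17 GB for weights alone, before activations and the DeepForest detectors, and `bitsandbytes` in the dependency list enables 4-bit quantization to shrink that substantially:

```python
# Rough weight-memory estimate; activations, KV cache and the DeepForest
# detectors add overhead on top of these figures.
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

# SmolLM3-3B, Qwen2.5-VL-3B-Instruct, Llama-3.2-3B-Instruct
fp16_total = sum(weights_gb(3.0, 2.0) for _ in range(3))
four_bit_total = sum(weights_gb(3.0, 0.5) for _ in range(3))  # bitsandbytes 4-bit
print(f"fp16: ~{fp16_total:.1f} GB, 4-bit: ~{four_bit_total:.1f} GB")
```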

## Models Used

- **SmolLM3-3B**: powers the Memory Agent (retrieving conversational context) and the Detector Agent (calling the detection tool with appropriate parameters)
- **Qwen2.5-VL-3B-Instruct**: used by the Visual Agent for multimodal image-text understanding
- **Llama-3.2-3B-Instruct**: used by the Ecology Agent for text understanding and generation
- **DeepForest Models**: tree, bird, and livestock detection, plus alive/dead tree classification

## Multi-Agent Workflow

[![Multi-agent workflow diagram](https://mermaid.ink/img/pako:eNplVV1z2jAQ_CsaPfQJMg4fofDQjmNDSgKBxIE2EXlQbRU0sSWPJDelJP-9Z1kmQGHGY-Pdvbu9O7HDsUwYHuC1ovkGPYQrgeDjk3FG1wwt8lTS5Bk1m1_QJYmY1lwKFChGDdw8V-BL-zpwlCVNeWJfo08oMlLBjw4YWGBIpiyTaov8NRMG-YKmW801aqJoOpugybTt4NU1tKThzpG-UQ0U_crU1_cKMCwBb49Mv6ERGTETb1BA4w1L0HU0u3VaI6tyRYaxTOW6jt1Ek5RmFLXPWqgNZaBXbhzb8a4s7xuJDNScoRGHbNE907kUmh3lWaVxK9_QmIyKNHUh5jxnKRc1dmz1rkmZGppSAeYoNBbccLDtb426tqibSqey9a4AgNnu7XLIG4ucEEhJpoV1Pdiw-MW9ntjXU9eaBw6ZrKHMOS1diiAgOvc8D_2SCi25hhCn-tV1amVuyQlm7-LdKxNoOSmVlYxhStAQPCzj1RXdWoUZ8ZPE_qyRkbY_UHysWAY6NE23R1FnljPfHRvwCR3UOrupp2D-MQV3JGQsH0nFtEEhMyyGKdznejJk833b7kn0wnP0H_dj0KvrnU0rIldMMEUN2CplCkOTprqan3tGtRTgtKNFlvBAhn9YXAD-IMJRKzqe5xgPlrEgdhI_0igrL1JT92ZhUUsScm0U_1lJO6j11xm9oQa9wtrkSiZFDHuRFKqcg7qbpTFOcmklv9s2_Re2btmRG_eW8YNM5BpZ_6rq6xx_VILVw3f78GixgcxyxTZMaP6bHco-WtDTx6aeDtvhyh6l8mSZvk-irTAbpktPfVigkBrqgL5fHWWX9ULXqwxNfIEZ_rmtbuqTzR1tgTXkePlP3Ajc4Ra6ppX1peDgISZ0Z9mQLPKknJzF2OKkgMr0M27AKcwTPPhFU80aOGMqo-Uz3pX8FYaiMrbCA7gVrDCKpiu8Eu_Ay6l4kjLDA6MKYCpZrDf1Q2FjhZzCEZ_txRUTCVOBLITBg-75ec-K4MEO_8GDTqt_5nUuWv32xXn_on_RaeAtHjTbvdZZv-X14Nvq9nodr_vewH9t3NZZt9Ntd7q9_mev3-94rQZmCYe9m1Z_L_Zf5v0fgSP6Cg?type=png)](https://mermaid.live/edit#pako:eNplVV1z2jAQ_CsaPfQJMg4fofDQjmNDSgKBxIE2EXlQbRU0sSWPJDelJP-9Z1kmQGHGY-Pdvbu9O7HDsUwYHuC1ovkGPYQrgeDjk3FG1wwt8lTS5Bk1m1_QJYmY1lwKFChGDdw8V-BL-zpwlCVNeWJfo08oMlLBjw4YWGBIpiyTaov8NRMG-YKmW801aqJoOpugybTt4NU1tKThzpG-UQ0U_crU1_cKMCwBb49Mv6ERGTETb1BA4w1L0HU0u3VaI6tyRYaxTOW6jt1Ek5RmFLXPWqgNZaBXbhzb8a4s7xuJDNScoRGHbNE907kUmh3lWaVxK9_QmIyKNHUh5jxnKRc1dmz1rkmZGppSAeYoNBbccLDtb426tqibSqey9a4AgNnu7XLIG4ucEEhJpoV1Pdiw-MW9ntjXU9eaBw6ZrKHMOS1diiAgOvc8D_2SCi25hhCn-tV1amVuyQlm7-LdKxNoOSmVlYxhStAQPCzj1RXdWoUZ8ZPE_qyRkbY_UHysWAY6NE23R1FnljPfHRvwCR3UOrupp2D-MQV3JGQsH0nFtEEhMyyGKdznejJk833b7kn0wnP0H_dj0KvrnU0rIldMMEUN2CplCkOTprqan3tGtRTgtKNFlvBAhn9YXAD-IMJRKzqe5xgPlrEgdhI_0igrL1JT92ZhUUsScm0U_1lJO6j11xm9oQa9wtrkSiZFDHuRFKqcg7qbpTFOcmklv9s2_Re2btmRG_eW8YNM5BpZ_6rq6xx_VILVw3f78GixgcxyxTZMaP6bHco-WtDTx6aeDtvhyh6l8mSZvk-irTAbpktPfVigkBrqgL5fHWWX9ULXqwxNfIEZ_rmtbuqTzR1tgTXkePlP3Ajc4Ra6ppX1peDgISZ0Z9mQLPKknJzF2OKkgMr0M27AKcwTPPhFU80aOGMqo-Uz3pX8FYaiMrbCA7gVrDCKpiu8Eu_Ay6l4kjLDA6MKYCpZrDf1Q2FjhZzCEZ_txRUTCVOBLITBg-75ec-K4MEO_8GDTqt_5nUuWv32xXn_on_RaeAtHjTbvdZZv-X14Nvq9nodr_vewH9t3NZZt9Ntd7q9_mev3-94rQZmCYe9m1Z_L_Zf5v0fgSP6Cg)

data/OSBS_029.tif — 594 KB (binary file not shown)

pyproject.toml — 64 additions, 0 deletions

@@ -0,0 +1,64 @@
[project]
name = "deepforest_agent"
version = "0.1.0"
description = "AI Agent for DeepForest object detection"
authors = [
    {name = "Your Name", email = "you@example.com"}
]
requires-python = ">=3.12"
readme = "README.md"
dependencies = [
    "accelerate",
    "albumentations<2.0",
    "deepforest",
    "fastapi",
    "geopandas",
    "google-genai",
    "google-generativeai",
    "gradio",
    "gradio-image-annotation",
    "langchain",
    "langchain-community",
    "langchain-google-genai",
    "langchain-huggingface",
    "langgraph",
    "matplotlib",
    "numpy",
    "num2words",
    "openai",
    "opencv-python",
    "outlines",
    "pandas",
    "pillow",
    "plotly",
    "pydantic",
    "pydantic-settings",
    "pytest",
    "pytest-cov",
    "python-dotenv",
    "pyyaml",
    "qwen-vl-utils",
    "rasterio",
    "requests",
    "scikit-image",
    "seaborn",
    "shapely",
    "streamlit",
    "torch",
    "torchvision",
    "tqdm",
    "transformers",
    "bitsandbytes",
]

[project.optional-dependencies]
dev = [
    "pre-commit",
    "pytest",
    "pytest-profiling",
    "yapf"
]

[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

requirements.txt — 41 additions, 0 deletions

@@ -0,0 +1,41 @@
accelerate
albumentations<2.0
deepforest
fastapi
geopandas
google-genai
google-generativeai
gradio
gradio-image-annotation
langchain
langchain-community
langchain-google-genai
langchain-huggingface
langgraph
matplotlib
numpy
num2words
openai
opencv-python
outlines
pandas
pillow
plotly
pydantic
pydantic-settings
pytest
pytest-cov
python-dotenv
pyyaml
qwen-vl-utils
rasterio
requests
scikit-image
seaborn
shapely
streamlit
torch
torchvision
tqdm
transformers
bitsandbytes
