- [Step 4: Install 3D Processing and ML Dependencies](#step-4-install-3d-processing-and-ml-dependencies)
- [Notes on Usage](#notes-on-usage)
- [Future Research Directions](#future-research-directions)
- [Citation](#citation)
- [External Research Collaborators](#external-research-collaborators)

_Last updated: December 18, 2025_

## Abstract
<div style="
max-width: 1100px;
margin: auto;
text-align: justify;
text-justify: inter-word;
line-height: 1.6;
">
Semantic segmentation of unstructured 3D point clouds remains a challenging problem, particularly in domains where appearance cues are unavailable. In plant phenotyping, LiDAR-based point clouds provide rich geometric information but suffer from class imbalance, occlusion, and geometric ambiguity between biological and abiotic structures. This work investigates the use of geometry-aware deep learning for organ-level plant segmentation using Dynamic Edge Convolutional Neural Networks (DECNNs). We propose a structured annotation protocol, geometric feature augmentation, and a loss formulation tailored to highly imbalanced plant data. The released code focuses on methodology and reproducibility and accompanies an ongoing research manuscript. Our goal with this applied research project is to demonstrate how computer vision tools have a direct impact on industry problems such as 3D perception, robotics, autonomous systems, medical imaging, and industrial and agricultural inspection.
</div>

---
## Repository Purpose
This repository releases **research code** developed in collaboration with **Oak Ridge National Laboratory** for studying semantic segmentation of 3D plant point clouds.
**Important:**
- No raw or processed data is included.
- No trained model checkpoints are provided.
- The repository focuses on **methodology, architecture, and evaluation**.

---

## Contributions

- Geometry-based semantic segmentation of plant organs from 3D LiDAR point clouds.
- A structured manual annotation protocol tailored to complex biological structures.
- Integration of local geometric descriptors within dynamic graph convolutional networks.
- Robust learning under severe class imbalance in organ-level segmentation tasks.
- A research-oriented, modular codebase designed to support reproducible experimentation.

---
Train with `python train.py`, then evaluate with `python evaluation.py` and visualize predictions with `python visualization.py`.

---
## Code Structure
The repository is organized as a modular research pipeline:
```text
├── data/                 # Directory structure only (no data included)
│   ├── train/
│   ├── val/
│   └── test/
│
├── src/
│   ├── dataset.py        # ETL and geometric feature computation
│   ├── model.py          # Dynamic Edge CNN architecture
│   └── inference.py      # Inference and post-processing
│
├── validations/          # Data integrity and sanity checks
│   ├── check_data.py
│   ├── check_labels.py
│   └── count_nans.py
│
├── train.py              # Training loop
├── evaluation.py         # Metric computation and bootstrapping
├── visualization.py      # 3D visualization utilities
└── README.md
```

All directories related to raw data, predictions, and model checkpoints are **intentionally excluded** from this repository to comply with data confidentiality and intellectual property constraints associated with Oak Ridge National Laboratory (ORNL).
---
## Method Overview
### Problem Setting
Given a 3D LiDAR point cloud acquired in a controlled phenotyping environment, the task is to assign each point to one of four semantic classes: stem, leaf, stake, or background.

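
Concretely, each annotated scan can be thought of as an `(N, 3)` coordinate array paired with an `(N,)` integer label vector. The snippet below is purely illustrative (random coordinates, assumed class indexing), not the repository's actual data format:

```python
import numpy as np

# Assumed class ordering for illustration; the protocol defines these four classes.
CLASSES = ["stem", "leaf", "stake", "background"]

rng = np.random.default_rng(0)
points = rng.uniform(-1.0, 1.0, size=(1000, 3))   # (N, 3) xyz coordinates
labels = rng.integers(0, len(CLASSES), size=1000)  # (N,) class index per point

# Per-class point counts expose the kind of imbalance described above.
counts = np.bincount(labels, minlength=len(CLASSES))
print(dict(zip(CLASSES, counts.tolist())))
```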
The task is challenging due to:
- Severe class imbalance.
- Structural similarity between stems and stakes.
- Occlusion and sparse sampling.
- Absence of RGB or spectral information.

---
High-quality ground truth is critical for supervised semantic segmentation of 3D point clouds.

The protocol defines a rule-based workflow for point-wise segmentation and labeling of LiDAR point clouds into four semantic classes: **stem**, **leaf**, **stake**, and **background**. It enforces strict completeness, naming conventions, boundary rules, and class assignment guidelines, ensuring that every point in the original scan is assigned a biologically meaningful label.
This annotation strategy was essential for:
- Producing reliable supervision signals for deep learning.
- Reducing label noise in geometrically ambiguous regions.
- Enabling consistent evaluation across samples.
- Supporting reproducibility and future dataset extensions.

A total of 30 fully annotated 3D LiDAR point clouds were generated and used for supervised training and evaluation.
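
Rules like these lend themselves to mechanical verification. The sketch below illustrates the kind of integrity checks the `validations/` scripts are described as performing; the function, rules, and class indexing here are assumptions for illustration, not the released code:

```python
import numpy as np

VALID_CLASSES = {0, 1, 2, 3}  # stem, leaf, stake, background (assumed indexing)

def check_labels(points: np.ndarray, labels: np.ndarray) -> list:
    """Return a list of protocol violations for one annotated scan."""
    problems = []
    if points.shape[0] != labels.shape[0]:
        problems.append("point/label count mismatch")
    if not set(np.unique(labels)).issubset(VALID_CLASSES):
        problems.append("unknown class index present")
    if np.isnan(points).any():
        problems.append("NaN coordinates found")
    return problems

# A clean scan passes; an out-of-range label or NaN coordinate is flagged.
assert check_labels(np.zeros((5, 3)), np.array([0, 1, 2, 3, 1])) == []
```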
The full annotation procedure, including setup instructions, segmentation steps, labeling rules, and export formats, is documented in detail here: [Manual Annotation Protocol](assets/docs/3D_Plant_Segmentation_Protocol.pdf)
---
These features encode local shape properties critical for organ discrimination.

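
Local shape descriptors of this kind are commonly derived from the eigenvalues of each neighborhood's covariance matrix. The sketch below shows one standard formulation (linearity, planarity, sphericity); the exact feature set used in this work may differ:

```python
import numpy as np

def shape_descriptors(neighborhood: np.ndarray) -> dict:
    """Eigenvalue-based shape features for one local neighborhood of shape (k, 3)."""
    centered = neighborhood - neighborhood.mean(axis=0)
    cov = centered.T @ centered / len(neighborhood)
    # Eigenvalues sorted descending: l1 >= l2 >= l3 >= 0.
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
    eps = 1e-12
    return {
        "linearity":  (l1 - l2) / (l1 + eps),   # high for stem-like structures
        "planarity":  (l2 - l3) / (l1 + eps),   # high for leaf-like surfaces
        "sphericity": l3 / (l1 + eps),          # high for isotropic clutter
    }

# A straight line segment should score near 1 on linearity.
line = np.column_stack([np.linspace(0.0, 1.0, 20), np.zeros(20), np.zeros(20)])
```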
The core model is a **Dynamic Edge Convolutional Neural Network (DECNN)** that:
- Dynamically constructs neighborhood graphs per layer.
- Learns edge features capturing local geometry.
- Operates directly on unstructured point clouds.

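
The dynamic-graph mechanism can be illustrated without a deep-learning framework: neighborhoods are recomputed in the current feature space, and each edge feature concatenates the center point with its offset to a neighbor. This NumPy sketch shows the mechanism only and is not the released `model.py`:

```python
import numpy as np

def knn_indices(features: np.ndarray, k: int) -> np.ndarray:
    """(N, k) indices of each point's k nearest neighbors in feature space."""
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)  # exclude self-loops
    return np.argsort(d2, axis=1)[:, :k]

def edge_features(features: np.ndarray, k: int) -> np.ndarray:
    """(N, k, 2F) EdgeConv-style edge features: [x_i, x_j - x_i] per edge."""
    idx = knn_indices(features, k)
    neighbors = features[idx]                             # (N, k, F)
    centers = np.repeat(features[:, None, :], k, axis=1)  # (N, k, F)
    return np.concatenate([centers, neighbors - centers], axis=-1)

rng = np.random.default_rng(1)
x = rng.normal(size=(50, 3))
e = edge_features(x, k=8)  # graph is rebuilt from the current features each call
```

Because the graph is rebuilt from the current features at every layer, semantically similar points can become neighbors even when they are far apart in Euclidean space.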
To address extreme class imbalance, training uses a composite loss combining:
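
As one illustration of a composite loss for imbalanced point-wise segmentation, the sketch below combines class-weighted cross-entropy with a soft Dice term; these particular terms, and the mixing weight `alpha`, are assumptions for illustration and not necessarily the exact formulation used here:

```python
import numpy as np

def weighted_ce(probs, labels, weights):
    """Class-weighted cross-entropy averaged over points."""
    p = np.clip(probs[np.arange(len(labels)), labels], 1e-12, 1.0)
    return float(np.mean(weights[labels] * -np.log(p)))

def dice_loss(probs, labels, n_classes):
    """Soft Dice loss averaged over classes; less sensitive to class frequency."""
    one_hot = np.eye(n_classes)[labels]
    inter = (probs * one_hot).sum(0)
    denom = probs.sum(0) + one_hot.sum(0)
    return float(1.0 - np.mean((2 * inter + 1e-12) / (denom + 1e-12)))

def composite_loss(probs, labels, weights, alpha=0.5):
    """Convex combination of the two terms; alpha is a hypothetical knob."""
    n_classes = probs.shape[1]
    return alpha * weighted_ce(probs, labels, weights) \
        + (1 - alpha) * dice_loss(probs, labels, n_classes)

# Perfect one-hot predictions drive the loss to zero.
y = np.array([0, 1, 2, 3, 0])
perfect = np.eye(4)[y]
assert composite_loss(perfect, y, np.ones(4)) < 1e-6
```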
Model performance is evaluated using:
- Intersection over Union (IoU).
- Precision and Recall (per class).
- Sample-averaged metrics.
- Bootstrapped confidence intervals.

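
These metrics are straightforward to compute per sample. Below is a minimal sketch of per-class IoU and a percentile bootstrap over sample-averaged scores (illustrative only, not the repository's `evaluation.py`):

```python
import numpy as np

def iou_per_class(pred, true, n_classes):
    """IoU for each class; NaN where a class is absent from both arrays."""
    ious = np.full(n_classes, np.nan)
    for c in range(n_classes):
        inter = np.sum((pred == c) & (true == c))
        union = np.sum((pred == c) | (true == c))
        if union > 0:
            ious[c] = inter / union
    return ious

def bootstrap_ci(per_sample_scores, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean metric."""
    rng = np.random.default_rng(seed)
    scores = np.asarray(per_sample_scores, dtype=float)
    means = [rng.choice(scores, size=len(scores), replace=True).mean()
             for _ in range(n_boot)]
    return float(np.quantile(means, alpha / 2)), float(np.quantile(means, 1 - alpha / 2))
```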
Qualitative evaluation is performed via 3D visualization of predicted segmentations.
Under the described experimental setup:
- Strong performance is observed on the dominant **Leaf** class.
- High recall is achieved for the biologically critical **Stem** class.
- Qualitative results show coherent reconstruction of plant structure.

Limitations include stem–stake ambiguity and boundary artifacts due to resolution constraints.
For a comprehensive discussion of experimental results, quantitative metrics, and additional analyses, please refer to the full exit report available here: [Exit Report](assests/docs/ORNL_DECNN_Exit_Report.pdf)
---
## Future Research Directions
Potential extensions of this work include:
---
## Citation
If you find this work useful in your research, please consider citing:
}
```
---
## External Research Collaborators
**Oak Ridge National Laboratory (ORNL) | Biosciences Division**
**Dr. John Lagergren**
---
## Acknowledgments
This research used resources of the Advanced Plant Phenotyping Laboratory and the Center for Bioenergy Innovation (CBI), which is a U.S. Department of Energy Bioenergy Research Center supported by the Office of Biological and Environmental Research in the DOE Office of Science. Oak Ridge National Laboratory is managed by UT-Battelle, LLC for the U.S. Department of Energy under Contract Number DE-AC05-00OR22725.
We sincerely thank **Dr. John Lagergren**, **Dr. Larry M. York**, and **Anand Seethepalli** (Oak Ridge National Laboratory, Biosciences Division) for providing access to experimental data, domain expertise, and valuable feedback throughout the project. We also thank **Dr. Jeff R. Knisley**, **Dr. Robert M. Price**, and **Dr. Michele Joyner** (Department of Mathematics & Statistics, East Tennessee State University) for their academic guidance and mentorship, and for making this collaboration possible by enabling meaningful real-world research and development experience in data science.
---
## Disclaimer
The views and conclusions expressed in this repository are those of the authors and do not necessarily represent the views of Oak Ridge National Laboratory or the U.S. Department of Energy. The code is provided for **academic and research purposes only**.
0 commit comments