The GRT reference implementation uses a [hydra](https://hydra.cc/docs/intro/)-based configuration system:
2. Create a virtual environment in `nrdk/grt` with `uv sync`.
3. Run with `uv run train.py`; see the hydra config files in `nrdk/grt/config/` for options.
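Steps 2–3 above can be sketched as a single shell session (a minimal sketch: the working directory and the exact checkout layout are assumptions, not confirmed by this page):

```sh
# Create the virtual environment for the GRT reference implementation
# (assumes the directory layout described above, relative to the checkout).
cd nrdk/grt
uv sync

# Launch training with the default configuration; hydra options live
# in nrdk/grt/config/ and can be overridden on the command line.
uv run train.py
```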

## Pre-Trained Checkpoints

Pre-trained model checkpoints for the GRT reference implementation on the [I/Q-1M dataset](https://radarml.github.io/red-rover/iq1m/) can also be found [here](https://radarml.github.io/red-rover/iq1m/osn/#download-from-osn).

With a single GPU, these checkpoints can be reproduced with the following:

=== "Base Model"

    The 3D polar occupancy base model is provided as the default configuration (i.e., `sensors=[radar,lidar]`, `transforms@transforms.sample=[radar,lidar3d]`, `objective=lidar3d`, `model/decoder=lidar3d`).

    ```sh
    uv run train.py meta.name=base meta.version=small size=small
    ```

=== "2D Occupancy"

    ```sh
    uv run train.py meta.name=occ2d meta.version=small size=small \
        +base=occ3d_to_occ2d \
        sensors=[radar,lidar] \
        transforms@transforms.sample=[radar,lidar2d] \
        objective=lidar2d \
        model/decoder=lidar2d
    ```

=== "Semantic Segmentation"

    ```sh
    uv run train.py meta.name=semseg meta.version=small size=small \
        +base=occ3d_to_semseg \
        sensors=[radar,semseg] \
        transforms@transforms.sample=[radar,semseg] \
        objective=semseg \
        model/decoder=semseg
    ```

=== "Ego-Motion"

    ```sh
    uv run train.py meta.name=vel meta.version=small size=small \
        +base=occ3d_to_vel \
        sensors=[radar,pose] \
        transforms@transforms.sample=[radar,vel] \
        objective=vel \
        model/decoder=vel
    ```

!!! tip

The GRT template includes reference training scripts which can be used for high-performance training:

```python
--8<-- "grt/train_minimal.py"
```
!!! info "Compile the Model"

    You can invoke the pytorch [JIT Compiler](https://docs.pytorch.org/tutorials/intermediate/torch_compile_tutorial.html) by setting `meta.compile=True`. Since the pytorch compiler interacts poorly with type checkers, you will also need to set
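As a sketch, the compile flag composes with the overrides shown in the training examples above (that the override syntax combines this way is an assumption based on those examples, not a documented invocation):

```sh
# Train the base model with the pytorch JIT compiler enabled; the other
# overrides match the documented "Base Model" defaults.
uv run train.py meta.name=base meta.version=small size=small meta.compile=True
```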

The NRDK library also includes the following extras:

- `nrdk[roverd]`: support for loading and processing data using the [roverd](https://radarml.github.io/red-rover/roverd/) format (i.e., as collected by the [`red-rover`](https://radarml.github.io/red-rover/) system).
- `nrdk[docs]`: a mkdocs + mkdocs-material + mdocstrings-python documentation stack.
- `nrdk[dev]`: linters, testing, pre-commit hooks, etc.
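The extras above use the standard Python extras syntax, so they can be combined in a single install (a sketch assuming `nrdk` is installed as a package from an index or local checkout; that availability is not confirmed by this page):

```sh
# Install nrdk together with the roverd data-loading and docs extras;
# quoting prevents the shell from expanding the brackets.
uv pip install "nrdk[roverd,docs]"
```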
abstract interface for composable dataloaders and preprocessing pipelines