@@ -29,7 +29,7 @@ My platform is like this:
 ## get started
 With a pretrained weight, you can run inference on a single image like this:
 ```
-$ python tools/demo.py --model bisenetv2 --weight-path /path/to/your/weights.pth --img-path ./example.png
+$ python tools/demo.py --config configs/bisenetv2_city.py --weight-path /path/to/your/weights.pth --img-path ./example.png
 ```
 This would run inference on the image and save the result image to `./res.jpg`.
 
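For reference, here is a minimal sketch of what this demo step amounts to with a generic PyTorch segmentation model: load the weights, normalize the image, take a per-pixel argmax over the logits, and write the prediction out. The `model` object and the normalization constants below are assumptions for illustration, not the repo's exact `tools/demo.py` code:

```python
# Rough sketch of single-image inference; not the repo's actual demo code.
# `model` is assumed to be the BiSeNet network already built from the config.
import torch
import numpy as np
from PIL import Image

def run_demo(model, weight_path, img_path, out_path='./res.jpg', device='cuda'):
    # load the trained weights
    state = torch.load(weight_path, map_location='cpu')
    model.load_state_dict(state)
    model.eval().to(device)

    # normalize the image (ImageNet-style constants assumed; the repo's values may differ)
    im = np.asarray(Image.open(img_path).convert('RGB')).astype(np.float32) / 255.
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    im = (im - mean) / std
    inp = torch.from_numpy(im.transpose(2, 0, 1)).unsqueeze(0).to(device)

    # forward pass and per-pixel argmax over class logits
    with torch.no_grad():
        logits = model(inp)
        if isinstance(logits, (tuple, list)):  # some variants also return aux heads
            logits = logits[0]
        pred = logits.argmax(dim=1).squeeze(0).cpu().numpy().astype(np.uint8)

    # save the raw label map; the real demo colorizes it with a palette
    Image.fromarray(pred).save(out_path)
```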
@@ -64,10 +64,10 @@ In order to train the model, you can run a command like this:
 $ export CUDA_VISIBLE_DEVICES=0,1
 
 # if you want to train with apex
-$ python -m torch.distributed.launch --nproc_per_node=2 tools/train.py --model bisenetv2 # or bisenetv1
+$ python -m torch.distributed.launch --nproc_per_node=2 tools/train.py --config configs/bisenetv2_city.py # or configs/bisenetv1_city.py
 
 # if you want to train with the pytorch fp16 feature from torch 1.6
-$ python -m torch.distributed.launch --nproc_per_node=2 tools/train_amp.py --model bisenetv2 # or bisenetv1
+$ python -m torch.distributed.launch --nproc_per_node=2 tools/train_amp.py --config configs/bisenetv2_city.py # or configs/bisenetv1_city.py
 ```
 
 Note that although `bisenetv2` has fewer FLOPs, it requires many more training iterations, so the training time of `bisenetv1` is shorter.
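The `train_amp.py` command above relies on the mixed-precision (AMP) API that shipped natively in torch 1.6. As a rough sketch of that pattern only (the model, dataloader, and loss here are placeholders, not the repo's training loop):

```python
# Minimal sketch of the torch>=1.6 native AMP pattern used by train_amp.py;
# model, loader, criterion, and optimizer are placeholders.
import torch

def train_amp_steps(model, loader, criterion, optimizer, device='cuda'):
    scaler = torch.cuda.amp.GradScaler()    # scales the loss to avoid fp16 underflow
    model.train().to(device)
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():      # forward pass runs in mixed precision
            logits = model(images)
            loss = criterion(logits, labels)
        scaler.scale(loss).backward()        # backward on the scaled loss
        scaler.step(optimizer)               # unscales grads, then optimizer step
        scaler.update()                      # adjusts the scale factor
```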
@@ -77,17 +77,17 @@ Note that though `bisenetv2` has fewer flops, it requires much more training ite
 You can also load trained model weights and finetune from them:
 ```
 $ export CUDA_VISIBLE_DEVICES=0,1
-$ python -m torch.distributed.launch --nproc_per_node=2 tools/train.py --finetune-from ./res/model_final.pth --model bisenetv2 # or bisenetv1
+$ python -m torch.distributed.launch --nproc_per_node=2 tools/train.py --finetune-from ./res/model_final.pth --config ./configs/bisenetv2_city.py # or ./configs/bisenetv1_city.py
 
 # same with the pytorch fp16 feature
-$ python -m torch.distributed.launch --nproc_per_node=2 tools/train_amp.py --finetune-from ./res/model_final.pth --model bisenetv2 # or bisenetv1
+$ python -m torch.distributed.launch --nproc_per_node=2 tools/train_amp.py --finetune-from ./res/model_final.pth --config ./configs/bisenetv2_city.py # or ./configs/bisenetv1_city.py
 ```
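Conceptually, `--finetune-from` just initializes the network from the saved state dict before training continues. A minimal sketch, assuming the checkpoint is a plain state dict (a DDP-saved checkpoint may carry `module.` prefixes):

```python
# Sketch of what --finetune-from amounts to; `model` and the path are placeholders.
import torch

def load_finetune_weights(model, ckpt_path='./res/model_final.pth'):
    state = torch.load(ckpt_path, map_location='cpu')
    # strict=False tolerates head mismatches if the class count changed (assumption)
    missing, unexpected = model.load_state_dict(state, strict=False)
    return missing, unexpected
```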
 
 
 ## eval pretrained models
 You can also evaluate a trained model like this:
 ```
-$ python tools/evaluate.py --model bisenetv1 --weight-path /path/to/your/weight.pth
+$ python tools/evaluate.py --config configs/bisenetv1_city.py --weight-path /path/to/your/weight.pth
 ```
 
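Evaluation on Cityscapes-style data is normally reported as mean IoU computed from a per-class confusion matrix. The repo's `evaluate.py` may add sliding-window or multi-scale inference on top of this, but the metric itself can be sketched as:

```python
# Minimal mIoU sketch for segmentation evaluation; not the repo's exact protocol.
import numpy as np

def update_confusion(conf, pred, label, n_classes, ignore_label=255):
    # accumulate an n_classes x n_classes confusion matrix from flattened label maps
    mask = label != ignore_label
    idx = n_classes * label[mask].astype(int) + pred[mask].astype(int)
    conf += np.bincount(idx, minlength=n_classes ** 2).reshape(n_classes, n_classes)
    return conf

def miou(conf):
    inter = np.diag(conf)
    union = conf.sum(0) + conf.sum(1) - inter
    iou = inter / np.maximum(union, 1)   # classes never seen contribute IoU 0 here
    return iou.mean()
```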
 ## Infer with tensorrt