-
Prerequisite
💬 Describe the reimplementation questions

When I use dist_train.sh for multi-GPU training on one computer using commands:

Is this forced assignment necessary? How can I use my GPU2 and GPU3 for the 2-GPU training?

Environment
- mmdet==2.25.0
- mmcv==1.6.0
- python==3.7.10
- pytorch==1.8.1

Expected results
No response

Additional information
No response
Replies: 2 comments 1 reply
-
You should set CUDA_VISIBLE_DEVICES=2,3 when running dist_train.sh. That exposes only physical GPUs 2 and 3 to the process, and the training job sees them renumbered as devices 0 and 1.
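A minimal sketch of the launch command, assuming the standard MMDetection repository layout (tools/dist_train.sh); the config path is a hypothetical placeholder, substitute your own config file:

```shell
# Expose only physical GPUs 2 and 3 to the job; CUDA renumbers the
# visible devices from 0, so inside training they appear as cuda:0
# and cuda:1 and the GPU count argument is simply 2.
# The config path below is a placeholder -- use your own config.
CUDA_VISIBLE_DEVICES=2,3 bash ./tools/dist_train.sh \
    configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco.py 2
```

No code changes are needed for this: PyTorch only enumerates the GPUs listed in CUDA_VISIBLE_DEVICES, always indexed from 0.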
-
Hello, your email has been received. I will check it as soon as possible. [automatic reply, translated from Chinese]