multi-gpu inference. Adds 'batch index' to the resulting prediction #854
Merged
skothenhill-nv merged 2 commits into main, May 1, 2025
Conversation
dictionary. This allows users to reconstruct the original ordering of predictions with multi-GPU inference. Signed-off-by: Steven <skothenhill@nvidia.com>
polinabinder1 approved these changes Apr 30, 2025
farhadrgh approved these changes Apr 30, 2025
Codecov Report
Attention: Patch coverage is
✅ All tests successful. No failed tests found.

Additional details and impacted files

@@            Coverage Diff             @@
##             main     #854      +/-   ##
==========================================
- Coverage   84.40%   84.40%   -0.01%
==========================================
  Files         138      138
  Lines        8685     8690       +5
==========================================
+ Hits         7331     7335       +4
- Misses       1354     1355       +1
==========================================
…e with batch_collator Signed-off-by: Steven <skothenhill@nvidia.com>
cspades pushed a commit that referenced this pull request May 4, 2025
…854)

### Description
Adds 'batch index' to the resulting prediction dictionary. This allows users to reconstruct the original ordering of predictions with multi-GPU inference. Prevents users from using 'epoch' mode of inference with multiple GPUs. This addresses a known issue: https://nvbugswb.nvidia.com/NvBugs5/SWBug.aspx?bugid=4717442&cmtNo=

### Type of changes
- [x] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Refactor
- [ ] Documentation update
- [ ] Other (please describe):

### Pre-submit Checklist
- [x] I have tested these changes locally
- [x] I have updated the documentation accordingly
- [x] All existing tests pass successfully

Signed-off-by: Steven <skothenhill@nvidia.com>
cspades pushed a commit that referenced this pull request May 4, 2025
…854) Signed-off-by: Steven <skothenhill@nvidia.com> Signed-off-by: Cory Ye <cye@nvidia.com>
trvachov pushed a commit that referenced this pull request May 16, 2025
…854) Signed-off-by: Steven <skothenhill@nvidia.com>
camirr-nv pushed a commit that referenced this pull request Jun 26, 2025
…854) Signed-off-by: Steven <skothenhill@nvidia.com> Signed-off-by: Ubuntu <camirr@nvidia.com>
Description
Adds 'batch index' to the resulting prediction dictionary. This allows users to reconstruct the original ordering of predictions with multi-GPU inference. Prevents users from using 'epoch' mode of inference with multiple GPUs. This addresses a known issue: https://nvbugswb.nvidia.com/NvBugs5/SWBug.aspx?bugid=4717442&cmtNo=
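As a sketch of why the batch index matters: with multi-GPU (distributed) inference, each rank predicts on an interleaved shard of the dataset, so simply concatenating the gathered per-rank predictions scrambles the original sample order. Assuming each prediction record carries the original dataset position of its sample (the key name and helper below are illustrative, not necessarily what this PR uses), a consumer can restore the original ordering roughly like this:

```python
def reorder_predictions(preds, batch_indices):
    """Restore dataset order for predictions gathered across ranks.

    preds          -- flat list of predictions concatenated from all ranks
    batch_indices  -- batch_indices[i] is the original dataset position of
                      preds[i] (the 'batch index' this PR adds to the
                      prediction dictionary)
    """
    # Sort positions by their original dataset index; with tensors this is
    # the same idea as torch.argsort on the gathered index tensor.
    order = sorted(range(len(preds)), key=lambda i: batch_indices[i])
    return [preds[i] for i in order]


# Toy example: two "ranks" produced interleaved shards, so the gathered
# predictions arrive out of dataset order.
preds = [10.0, 30.0, 20.0, 40.0]       # rank 0 saw samples 0, 2; rank 1 saw 1, 3
indices = [0, 2, 1, 3]
print(reorder_predictions(preds, indices))  # -> [10.0, 20.0, 30.0, 40.0]
```

This is also why 'epoch'-mode inference is disallowed with multiple GPUs in this change: without a per-sample index, per-epoch aggregation across ranks cannot be mapped back to the original dataset ordering.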
Type of changes
Pre-submit Checklist