Conversation

Copilot AI commented Jan 7, 2026

_save_debug_information() was logging "dataloader_train.transform": "None" despite augmentations being applied. The transforms are actually stored in dataloader.generator.transforms, not at the top level.

Changes:

  • Extract and log generator.transforms when present on dataloaders
  • Add exception handling for stringification failures, including the exception type in the error message
  • Refactor to use a local variable dl instead of repeated getattr(self, k) calls

Result:

{
  "dataloader_train.generator.transforms": "ComposeTransforms( transforms = [SpatialTransform(...), GaussianNoiseTransform(...), ...])",
  "dataloader_train.transform": "None"
}
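
For reference, a minimal sketch of what the dataloader branch of _save_debug_information() might look like after these changes (a sketch only; the exact variable names and formatting in the PR diff may differ):

  if k in ['dataloader_train', 'dataloader_val']:
      dl = getattr(self, k)  # look up the dataloader once instead of repeating getattr(self, k)
      if hasattr(dl, 'generator'):
          dct[k + '.generator'] = str(dl.generator)
          if hasattr(dl.generator, 'transforms'):
              try:
                  dct[k + '.generator.transforms'] = str(dl.generator.transforms)
              except Exception as e:
                  # include the exception type so stringification failures are easier to trace
                  dct[k + '.generator.transforms'] = f"Could not stringify generator.transforms: {type(e).__name__}: {e}"
      if hasattr(dl, 'num_processes'):
          dct[k + '.num_processes'] = str(dl.num_processes)
      if hasattr(dl, 'transform'):
          dct[k + '.transform'] = str(dl.transform)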
Original prompt

This section details the original issue you should resolve

<issue_title>_save_debug_information() prints "dataloader_train.transform": "None" everytime despite augmentation applied</issue_title>
<issue_description>As in the title, the debug transform info is always set to None.
I modified the debug function to print the actual augmentations applied. Figured it would be useful to someone else; especially when you're playing around with the augmentations, it's useful to log them:

Add one extra check in _save_debug_information() to include the internal data loader’s transforms:

  if k in ['dataloader_train', 'dataloader_val']:
      if hasattr(getattr(self, k), 'generator'):
          dct[k + '.generator'] = str(getattr(self, k).generator)
      if hasattr(getattr(self, k), 'num_processes'):
          dct[k + '.num_processes'] = str(getattr(self, k).num_processes)
      if hasattr(getattr(self, k), 'transform'):
          dct[k + '.transform'] = str(getattr(self, k).transform)

Suggested edit:

  if k in ['dataloader_train', 'dataloader_val']:
      if hasattr(getattr(self, k), 'generator'):
          dct[k + '.generator'] = str(getattr(self, k).generator)
          if hasattr(getattr(self, k).generator, 'transforms'):
              try:
                  dct[k + '.generator.transforms'] = str(getattr(self, k).generator.transforms)
              except Exception as e:
                  dct[k + '.generator.transforms'] = f"Could not stringify generator.transforms: {e}"

      if hasattr(getattr(self, k), 'num_processes'):
          dct[k + '.num_processes'] = str(getattr(self, k).num_processes)
      if hasattr(getattr(self, k), 'transform'):
          dct[k + '.transform'] = str(getattr(self, k).transform)

So previously in the debug.json I would get:

"dataloader_train": "<batchgenerators.dataloading.nondet_multi_threaded_augmenter.NonDetMultiThreadedAugmenter object at 0x00000255CFF9C890>",
    "dataloader_train.generator": "<nnunetv2.training.dataloading.data_loader.nnUNetDataLoader object at 0x00000255CDD05DC0>",
    "dataloader_train.num_processes": "12",
    "dataloader_train.transform": "None",

Now I get:

"dataloader_train": "<batchgenerators.dataloading.nondet_multi_threaded_augmenter.NonDetMultiThreadedAugmenter object at 0x000002704922E060>",
    "dataloader_train.generator": "<nnunetv2.training.dataloading.data_loader.nnUNetDataLoader object at 0x000002704946E930>",
    "dataloader_train.generator.transforms": "ComposeTransforms( transforms = [Convert3DTo2DTransform(  ), SpatialTransform( patch_size = [192, 192], patch_center_dist_from_border = [0, 0], random_crop = False, p_elastic_deform = 0, elastic_deform_scale = (0, 0.2), elastic_deform_magnitude = (0, 0.2), p_rotation = 0.2, rotation = (-3.141592653589793, 3.141592653589793), p_scaling = 0.4, scaling = (0.7, 1.8), p_synchronize_scaling_across_axes = 1, p_synchronize_def_scale_across_axes = 0, bg_style_seg_sampling = False, mode_seg = 'bilinear', border_mode_seg = 'zeros', center_deformation = True, padding_mode_image = 'zeros' ), Convert2DTo3DTransform(  ), RandomTransform(p=0.3, transform=GaussianNoiseTransform( noise_variance = (0, 0.2), p_per_channel = 1, synchronize_channels = True )), RandomTransform(p=0.3, transform=GaussianBlurTransform( blur_sigma = (0.5, 1.0), benchmark = True, synchronize_channels = False, synchronize_axes = False, p_per_channel = 0.5, benchmark_use_fft = {}, benchmark_num_runs = 9 )), RandomTransform(p=0.15, transform=MultiplicativeBrightnessTransform( multiplier_range = BGContrast(contrast_range=(0.75, 1.25)), synchronize_channels = False, p_per_channel = 1 )), RandomTransform(p=0.15, transform=ContrastTransform( contrast_range = BGContrast(contrast_range=(0.75, 1.25)), preserve_range = True, synchronize_channels = False, p_per_channel = 1 )), RandomTransform(p=0.25, transform=SimulateLowResolutionTransform( scale = (0.5, 1), synchronize_channels = False, synchronize_axes = True, ignore_axes = (0,), allowed_channels = None, p_per_channel = 0.5, upmodes = {1: 'linear', 2: 'bilinear', 3: 'trilinear'} )), RandomTransform(p=0.1, transform=GammaTransform( gamma = BGContrast(contrast_range=(0.7, 1.5)), p_invert_image = 1, synchronize_channels = False, p_per_channel = 1, p_retain_stats = 1 )), RandomTransform(p=0.3, transform=GammaTransform( gamma = BGContrast(contrast_range=(0.7, 1.5)), p_invert_image = 0, synchronize_channels = False, p_per_channel = 1, p_retain_stats = 1 )), RemoveLabelTansform( segmentation_channels = None, label_value = -1, set_to = 0 ), DownsampleSegForDSTransform( ds_scales = [[np.float64(1.0), np.float64(1.0), np.float64(1.0)], [np.float64(1.0), np.float64(0.5), np.float64(0.5)], [np.float64(0.5), np.float64(0.25), np.float64(0.25)], [np.float64(0.25), np.float64(0.125), np.float64(0.125)], [np.float64(0.125), np.float64(0.0625), np.float64(0.0625)]] )] )",
    "dataloader_train.num_processes": "12",
    "dataloader_train.transform": "None",
    

</issue_description>

<agent_instructions>Create a PR that includes the suggested edit.</agent_instructions>

Comments on the Issue (you are @copilot in this section)

@TaWald Hey @anan1504 Thanks for this, it ...



@TaWald left a comment


Included the proposed changes to log the augmentation pipeline parameters.

TaWald marked this pull request as ready for review January 7, 2026 09:23
Copilot AI changed the title from "[WIP] Fix debug information logging in dataloader" to "Log actual augmentation transforms in debug.json" on Jan 7, 2026
Copilot AI requested a review from TaWald January 7, 2026 09:23

TaWald commented Jan 7, 2026

PR of #2933


@TaWald left a comment


Adds details of the training and validation augmentation pipelines to debug.json.
