Commit 42a1125

2516 Change reference code links of PyTorch and ignite to API docs (#2518)
* [DLMED] update lr_scheduler links
* [DLMED] update all the other links
* [DLMED] fix typo

Signed-off-by: Nic Ma <[email protected]>
1 parent f3d436a commit 42a1125

File tree: 15 files changed, +38 -30 lines changed

monai/data/dataloader.py

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ class DataLoader(_TorchDataLoader):
     See: :py:class:`monai.transforms.Compose`.
 
     For more details about :py:class:`torch.utils.data.DataLoader`, please see:
-    https://github.com/pytorch/pytorch/blob/master/torch/utils/data/dataloader.py
+    https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader.
 
     For example, to construct a randomized dataset and iterate with the data loader:
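The docstring above promises an example of iterating a randomized dataset. As a dependency-free sketch (the real `DataLoader` operates on `torch` datasets and adds workers, collation, and pinned memory), the shuffle-then-batch pattern it implements looks like this:

```python
import random

def iterate_batches(dataset, batch_size, shuffle=True, seed=0):
    """Yield fixed-size batches, optionally shuffling indices first.

    Pure-Python mimic of what a shuffling DataLoader does each epoch:
    permute the index list, then group indices into batches.
    """
    indices = list(range(len(dataset)))
    if shuffle:
        random.Random(seed).shuffle(indices)
    for start in range(0, len(indices), batch_size):
        yield [dataset[i] for i in indices[start:start + batch_size]]

# 10 items with batch_size=4 -> three batches of sizes 4, 4, 2
batches = list(iterate_batches(list(range(10)), batch_size=4))
```

The names here (`iterate_batches`) are illustrative, not part of the MONAI or PyTorch API.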

monai/data/samplers.py

Lines changed: 2 additions & 2 deletions
@@ -34,7 +34,7 @@ class DistributedSampler(_TorchDistributedSampler):
     kwargs: additional arguments for `DistributedSampler` super class, can be `seed` and `drop_last`.
 
     More information about DistributedSampler, please check:
-    https://github.com/pytorch/pytorch/blob/master/torch/utils/data/distributed.py
+    https://pytorch.org/docs/stable/data.html#torch.utils.data.distributed.DistributedSampler.
 
     """

@@ -61,7 +61,7 @@ class DistributedWeightedRandomSampler(DistributedSampler):
     """
     Extend the `DistributedSampler` to support weighted sampling.
     Refer to `torch.utils.data.WeightedRandomSampler`, for more details please check:
-    https://github.com/pytorch/pytorch/blob/master/torch/utils/data/sampler.py#L150
+    https://pytorch.org/docs/stable/data.html#torch.utils.data.WeightedRandomSampler.
 
     Args:
         dataset: Dataset used for sampling.
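The semantics that `DistributedWeightedRandomSampler` borrows from `torch.utils.data.WeightedRandomSampler` — drawing indices with replacement, with probability proportional to per-sample weights — can be sketched without torch (the real sampler additionally shards the draws across distributed ranks):

```python
import random

def weighted_sample(weights, num_samples, seed=0):
    """Draw num_samples indices with replacement, each index i chosen
    with probability proportional to weights[i] -- the core behaviour
    of WeightedRandomSampler, minus the rank sharding."""
    rng = random.Random(seed)
    population = list(range(len(weights)))
    return rng.choices(population, weights=weights, k=num_samples)

# indices 0 and 1 have zero weight, index 3 is 3x more likely than index 2
picked = weighted_sample([0.0, 0.0, 1.0, 3.0], num_samples=1000)
```

`weighted_sample` is a hypothetical helper for illustration, not a MONAI function.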

monai/data/utils.py

Lines changed: 1 addition & 1 deletion
@@ -817,7 +817,7 @@ def partition_dataset(
     Split the dataset into N partitions. It can support shuffle based on specified random seed.
     Will return a set of datasets, every dataset contains 1 partition of original dataset.
     And it can split the dataset based on specified ratios or evenly split into `num_partitions`.
-    Refer to: https://github.com/pytorch/pytorch/blob/master/torch/utils/data/distributed.py.
+    Refer to: https://pytorch.org/docs/stable/distributed.html#module-torch.distributed.launch.
 
     Note:
         It also can be used to partition dataset for ranks in distributed training.
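One even-split strategy consistent with the note above — and the same interleaved assignment rule `DistributedSampler` uses to shard a dataset across ranks — can be sketched in a few lines (`monai.data.partition_dataset` also supports ratios, shuffling, and divisibility padding, which are omitted here):

```python
def partition_evenly(data, num_partitions):
    """Interleaved even split: partition r takes items r, r+N, r+2N, ...
    This mirrors how DistributedSampler assigns indices to ranks."""
    return [data[rank::num_partitions] for rank in range(num_partitions)]

# 10 items across 3 partitions; every item lands in exactly one partition
parts = partition_evenly(list(range(10)), num_partitions=3)
```

`partition_evenly` is an illustrative name; it is not the MONAI implementation itself.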

monai/engines/evaluator.py

Lines changed: 6 additions & 3 deletions
@@ -64,7 +64,8 @@ class Evaluator(Workflow):
     event_names: additional custom ignite events that will register to the engine.
         new events can be a list of str or `ignite.engine.events.EventEnum`.
     event_to_attr: a dictionary to map an event to a state attribute, then add to `engine.state`.
-        for more details, check: https://github.com/pytorch/ignite/blob/v0.4.4.post1/ignite/engine/engine.py#L160
+        for more details, check: https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html
+        #ignite.engine.engine.Engine.register_events.
     decollate: whether to decollate the batch-first data to a list of data after model computation,
         default to `True`. if `False`, postprocessing will be ignored as the `monai.transforms` module
         assumes channel-first data.

@@ -166,7 +167,8 @@ class SupervisedEvaluator(Evaluator):
     event_names: additional custom ignite events that will register to the engine.
         new events can be a list of str or `ignite.engine.events.EventEnum`.
     event_to_attr: a dictionary to map an event to a state attribute, then add to `engine.state`.
-        for more details, check: https://github.com/pytorch/ignite/blob/v0.4.4.post1/ignite/engine/engine.py#L160
+        for more details, check: https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html
+        #ignite.engine.engine.Engine.register_events.
     decollate: whether to decollate the batch-first data to a list of data after model computation,
         default to `True`. if `False`, postprocessing will be ignored as the `monai.transforms` module
         assumes channel-first data.

@@ -293,7 +295,8 @@ class EnsembleEvaluator(Evaluator):
     event_names: additional custom ignite events that will register to the engine.
         new events can be a list of str or `ignite.engine.events.EventEnum`.
     event_to_attr: a dictionary to map an event to a state attribute, then add to `engine.state`.
-        for more details, check: https://github.com/pytorch/ignite/blob/v0.4.4.post1/ignite/engine/engine.py#L160
+        for more details, check: https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html
+        #ignite.engine.engine.Engine.register_events.
     decollate: whether to decollate the batch-first data to a list of data after model computation,
         default to `True`. if `False`, postprocessing will be ignored as the `monai.transforms` module
         assumes channel-first data.
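The `event_to_attr` contract these docstrings describe — registering an event under an attribute name creates that attribute on `engine.state` — can be shown with a toy engine. This is a minimal mimic of the bookkeeping ignite's `Engine.register_events(*events, event_to_attr=...)` performs, not ignite itself:

```python
class State:
    """Bare attribute container standing in for ignite's engine state."""
    pass

class MiniEngine:
    """Toy engine: register_events maps each event name to a state
    attribute (initialised to 0); firing the event increments it."""
    def __init__(self):
        self.state = State()
        self._event_to_attr = {}

    def register_events(self, *event_names, event_to_attr=None):
        for name in event_names:
            attr = (event_to_attr or {}).get(name, name.lower())
            self._event_to_attr[name] = attr
            setattr(self.state, attr, 0)

    def fire_event(self, name):
        attr = self._event_to_attr[name]
        setattr(self.state, attr, getattr(self.state, attr) + 1)

engine = MiniEngine()
engine.register_events(
    "FORWARD_COMPLETED",
    event_to_attr={"FORWARD_COMPLETED": "forward_completed"},
)
engine.fire_event("FORWARD_COMPLETED")
# engine.state.forward_completed is now 1
```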

monai/engines/trainer.py

Lines changed: 2 additions & 1 deletion
@@ -94,7 +94,8 @@ class SupervisedTrainer(Trainer):
     event_names: additional custom ignite events that will register to the engine.
         new events can be a list of str or `ignite.engine.events.EventEnum`.
     event_to_attr: a dictionary to map an event to a state attribute, then add to `engine.state`.
-        for more details, check: https://github.com/pytorch/ignite/blob/v0.4.4.post1/ignite/engine/engine.py#L160
+        for more details, check: https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html
+        #ignite.engine.engine.Engine.register_events.
     decollate: whether to decollate the batch-first data to a list of data after model computation,
         default to `True`. if `False`, postprocessing will be ignored as the `monai.transforms` module
         assumes channel-first data.

monai/engines/utils.py

Lines changed: 3 additions & 2 deletions
@@ -37,7 +37,7 @@
 class IterationEvents(EventEnum):
     """
     Additional Events engine can register and trigger in the iteration process.
-    Refer to the example in ignite: https://github.com/pytorch/ignite/blob/master/ignite/engine/events.py#L146
+    Refer to the example in ignite: https://pytorch.org/ignite/generated/ignite.engine.events.EventEnum.html.
     These Events can be triggered during training iteration:
     `FORWARD_COMPLETED` is the Event when `network(image, label)` completed.
     `LOSS_COMPLETED` is the Event when `loss(pred, label)` completed.

@@ -106,7 +106,8 @@ def default_prepare_batch(
 ) -> Union[Tuple[torch.Tensor, Optional[torch.Tensor]], torch.Tensor]:
     """
     Default function to prepare the data for current iteration.
-    Refer to ignite: https://github.com/pytorch/ignite/blob/v0.4.2/ignite/engine/__init__.py#L28.
+    Refer to ignite: https://pytorch.org/ignite/v0.4.5/generated/ignite.engine.create_supervised_trainer.html
+    #ignite.engine.create_supervised_trainer.
 
     Returns:
         image, label(optional).
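The `image, label(optional)` return contract can be sketched without torch. This is a simplified stand-in for `default_prepare_batch` (the real function also moves tensors to the target device and handles `non_blocking` transfer); the function name and dict keys shown are illustrative:

```python
def prepare_batch_sketch(batchdata):
    """Dict batches yield an (image, label) pair, with label None when
    absent; anything else is passed through unchanged."""
    if isinstance(batchdata, dict):
        return batchdata["image"], batchdata.get("label")
    return batchdata

pair = prepare_batch_sketch({"image": [1, 2], "label": [0, 1]})       # ([1, 2], [0, 1])
image_only = prepare_batch_sketch({"image": [1, 2]})                  # ([1, 2], None)
```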

monai/engines/workflow.py

Lines changed: 2 additions & 1 deletion
@@ -72,7 +72,8 @@ class Workflow(IgniteEngine):  # type: ignore[valid-type, misc] # due to optiona
     event_names: additional custom ignite events that will register to the engine.
         new events can be a list of str or `ignite.engine.events.EventEnum`.
     event_to_attr: a dictionary to map an event to a state attribute, then add to `engine.state`.
-        for more details, check: https://github.com/pytorch/ignite/blob/v0.4.4.post1/ignite/engine/engine.py#L160
+        for more details, check: https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html
+        #ignite.engine.engine.Engine.register_events.
     decollate: whether to decollate the batch-first data to a list of data after model computation,
         default to `True`. if `False`, postprocessing will be ignored as the `monai.transforms` module
         assumes channel-first data.

monai/handlers/checkpoint_loader.py

Lines changed: 3 additions & 2 deletions
@@ -60,8 +60,9 @@ class CheckpointLoader:
         checkpoint, so skip loading checkpoint for optimizer.
 
     For more details about loading checkpoint, please refer to:
-    https://github.com/pytorch/ignite/blob/v0.4.5/ignite/handlers/checkpoint.py#L499.
-    https://github.com/pytorch/pytorch/blob/v1.9.0/torch/nn/modules/module.py#L1354.
+    https://pytorch.org/ignite/v0.4.5/generated/ignite.handlers.checkpoint.Checkpoint.html
+    #ignite.handlers.checkpoint.Checkpoint.load_objects.
+    https://pytorch.org/docs/stable/generated/torch.nn.Module.html#torch.nn.Module.load_state_dict.
 
     """
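The load pattern both links describe — restore each named object from the matching sub-dict of a checkpoint, skipping names the checkpoint lacks (e.g. an optimizer that was never saved) — can be sketched in pure Python. `Toy` and `load_objects` here are illustrative stand-ins, not the ignite API:

```python
class Toy:
    """Minimal object following the state_dict/load_state_dict protocol."""
    def __init__(self):
        self.value = 0
    def state_dict(self):
        return {"value": self.value}
    def load_state_dict(self, state):
        self.value = state["value"]

def load_objects(to_load, checkpoint):
    """For each named object, restore it from checkpoint[name] if present;
    silently skip names absent from the checkpoint."""
    for name, obj in to_load.items():
        if name in checkpoint:
            obj.load_state_dict(checkpoint[name])

net = Toy()
# "opt" is not in the checkpoint, so it is skipped rather than failing
load_objects({"net": net, "opt": Toy()}, {"net": {"value": 7}})
```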

monai/handlers/checkpoint_saver.py

Lines changed: 3 additions & 2 deletions
@@ -57,8 +57,9 @@ class CheckpointSaver:
     key_metric_filename: set a fixed filename to set the best metric model, if not None,
         `key_metric_n_saved` should be 1 and only keep the best metric model.
     key_metric_save_state: whether to save the tracking list of key metric in the checkpoint file.
-        if `True`, then will save an object in the checkpoint file with key `checkpointer` to be consistent
-        with ignite: https://github.com/pytorch/ignite/blob/master/ignite/handlers/checkpoint.py#L99.
+        if `True`, then will save an object in the checkpoint file with key `checkpointer` to be
+        consistent with the `include_self` arg of `Checkpoint` in ignite:
+        https://pytorch.org/ignite/v0.4.5/generated/ignite.handlers.checkpoint.Checkpoint.html.
     typically, it's used to resume training and compare current metric with previous N values.
     key_metric_greater_or_equal: if `True`, the latest equally scored model is stored. Otherwise,
         save the first equally scored model. default to `False`.

monai/handlers/parameter_scheduler.py

Lines changed: 5 additions & 5 deletions
@@ -123,8 +123,8 @@ def _exponential(initial_value: float, gamma: float, current_step: int) -> float
     """
     Decays the parameter value by gamma every step.
 
-    Based on the closed form of ExponentialLR from Pytorch
-    https://github.com/pytorch/pytorch/blob/master/torch/optim/lr_scheduler.py#L457
+    Based on the closed form of ExponentialLR from Pytorch:
+    https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.ExponentialLR.html.
 
     Args:
         initial_value (float): Starting value of the parameter.

@@ -141,8 +141,8 @@ def _step(initial_value: float, gamma: float, step_size: int, current_step: int)
     """
     Decays the parameter value by gamma every step_size.
 
-    Based on StepLR from Pytorch.
-    https://github.com/pytorch/pytorch/blob/master/torch/optim/lr_scheduler.py#L377
+    Based on StepLR from Pytorch:
+    https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.StepLR.html.
 
     Args:
         initial_value (float): Starting value of the parameter.

@@ -161,7 +161,7 @@ def _multistep(initial_value: float, gamma: float, milestones: List[int], curren
     Decays the parameter value by gamma once the number of steps reaches one of the milestones.
 
     Based on MultiStepLR from Pytorch.
-    https://github.com/pytorch/pytorch/blob/master/torch/optim/lr_scheduler.py#L424
+    https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiStepLR.html.
 
     Args:
         initial_value (float): Starting value of the parameter.
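The three closed forms these docstrings reference can be written out directly; the sketch below follows the PyTorch formulas for ExponentialLR, StepLR, and MultiStepLR (function names here are illustrative, not the private MONAI helpers):

```python
from bisect import bisect_right

def exponential(initial_value, gamma, current_step):
    # ExponentialLR closed form: value = initial * gamma ** step
    return initial_value * gamma ** current_step

def step(initial_value, gamma, step_size, current_step):
    # StepLR: decay by gamma once every `step_size` steps
    return initial_value * gamma ** (current_step // step_size)

def multistep(initial_value, gamma, milestones, current_step):
    # MultiStepLR: decay by gamma once per milestone already passed;
    # bisect_right counts how many milestones are <= current_step
    return initial_value * gamma ** bisect_right(milestones, current_step)
```

For example, `exponential(1.0, 0.5, 3)` gives `0.125`, and `multistep(1.0, 0.1, [5, 8], 6)` gives `0.1` because only the first milestone has been passed.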
