How to change the schedule of the built-in PyTorch Profiler? #12611
Replies: 3 comments 2 replies
-
Hi @dpaleka! Haven't tried it myself yet, but something like this should work:

```python
import torch
from pytorch_lightning import Trainer
from pytorch_lightning.profiler import PyTorchProfiler

profiler = PyTorchProfiler(
    schedule=torch.profiler.schedule(
        wait=2,
        warmup=2,
        active=6,
        repeat=1,
    ),
    # tensorboard_trace_handler needs an output directory (example path)
    on_trace_ready=torch.profiler.tensorboard_trace_handler("./profiler_logs"),
    with_stack=True,
)
trainer = Trainer(..., profiler=profiler)
```
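For what it's worth, the `schedule(wait, warmup, active, repeat)` arguments describe a per-step cycle: skip `wait` steps, warm up for `warmup` steps, record for `active` steps, and repeat the whole cycle `repeat` times (the last active step of a cycle also saves the trace). Here is a simplified pure-Python sketch of that cycling logic; this is my own approximation for illustration, not `torch.profiler`'s actual implementation, and the string action names only mirror `torch.profiler.ProfilerAction`:

```python
def make_schedule(wait, warmup, active, repeat=0, skip_first=0):
    """Simplified sketch of torch.profiler.schedule semantics."""
    def action(step):
        if step < skip_first:
            return "NONE"
        step -= skip_first
        cycle_len = wait + warmup + active
        # repeat=0 means cycle forever; otherwise stop after `repeat` cycles
        if repeat > 0 and step >= repeat * cycle_len:
            return "NONE"
        pos = step % cycle_len
        if pos < wait:
            return "NONE"
        if pos < wait + warmup:
            return "WARMUP"
        # the last active step of a cycle also flushes the trace
        return "RECORD_AND_SAVE" if pos == cycle_len - 1 else "RECORD"
    return action
```

With the values from the snippet above (`wait=2, warmup=2, active=6, repeat=1`), steps 0-1 are idle, steps 2-3 warm up, steps 4-9 record (step 9 saves the trace), and everything afterwards is idle.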
-
Is there any schedule-related test for https://pytorch-lightning.readthedocs.io/en/stable/_modules/pytorch_lightning/profiler/pytorch.html#PyTorchProfiler? I can't make this work properly. The default schedule seems to be … Can we transfer this discussion to an issue? The schedule parameter should at least be documented.
-
Are there any updates on this topic?
-
When using the PyTorch Profiler in plain PyTorch, one can change the profiling schedule; see e.g. the arguments in the first snippet here:
How can this be done in PyTorch Lightning? There is nothing about the profiler schedule in the docs.