Throughput monitor spamming flops_per_batch missing warning #21450

@michaelvay

Bug description

The warning is triggered on every update call; could it be moved to the setup step of the ThroughputMonitor instead?
https://github.com/Lightning-AI/pytorch-lightning/blame/master/src/lightning/pytorch/callbacks/throughput_monitor.py#L139
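
For context, a minimal sketch of the suggested direction, not the actual callback internals: check for `flops_per_batch` once in `setup` and keep the per-batch update path silent. The class name, hook bodies, and warning text below are illustrative; `rank_zero_warn` is assumed to be the helper used for such messages.

```python
from lightning.pytorch.utilities import rank_zero_warn


class ThroughputMonitorSketch:
    """Illustrative only: emit the missing `flops_per_batch` warning once, during setup."""

    def setup(self, trainer, pl_module, stage):
        # warn a single time, when the callback is set up
        if not hasattr(pl_module, "flops_per_batch"):
            rank_zero_warn(
                f"`flops_per_batch` is not defined on {type(pl_module).__name__};"
                " FLOPs will not be included in the throughput metrics."
            )

    def _update(self, trainer, pl_module, batch, iter_num):
        # the per-batch path only reads the attribute; no repeated warning here
        return getattr(pl_module, "flops_per_batch", None)
```

An equivalent alternative would be a one-shot flag or warning cache inside the update path; either way the message would fire once per run instead of once per batch.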

What version are you seeing the problem on?

master

Reproduced in studio

No response

How to reproduce the bug
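
A hypothetical minimal reproduction, assuming the callback's `batch_size_fn` argument and using an illustrative `PlainModel` (any `LightningModule` without a `flops_per_batch` attribute should behave the same): the warning is expected to repeat for every monitor update during `fit`.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

from lightning.pytorch import LightningModule, Trainer
from lightning.pytorch.callbacks import ThroughputMonitor


class PlainModel(LightningModule):
    # intentionally no `flops_per_batch` attribute or property
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,)))
    train_loader = DataLoader(dataset, batch_size=16)
    trainer = Trainer(
        max_epochs=1,
        limit_train_batches=8,
        callbacks=[ThroughputMonitor(batch_size_fn=lambda batch: batch[0].size(0))],
    )
    # expected: the `flops_per_batch` warning appears once;
    # observed: it is repeated for each call to the monitor's update hook
    trainer.fit(PlainModel(), train_loader)
```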

Error messages and logs

No response

Environment

Current environment
#- PyTorch Lightning Version (e.g., 2.5.0):
#- PyTorch Version (e.g., 2.5):
#- Python version (e.g., 3.12):
#- OS (e.g., Linux):
#- CUDA/cuDNN version:
#- GPU models and configuration:
#- How you installed Lightning (`conda`, `pip`, source):

More info

No response

cc @ethanwharris @lantiga
