
Commit a36955a

docs: enhance docstring for CSVLogger class with detailed usage examples and file structure
1 parent 1fc44a7 commit a36955a


src/lightning/pytorch/loggers/csv_logs.py

Lines changed: 51 additions & 6 deletions
@@ -70,15 +70,60 @@ def log_hparams(self, params: dict[str, Any]) -> None:
 class CSVLogger(Logger, FabricCSVLogger):
     r"""Log to local file system in CSV and YAML format.

-    Metrics are logged to CSV format while hyperparameters are logged to YAML format.
+    **Metrics** (from ``self.log()`` calls) are logged to CSV format while **hyperparameters**
+    (from ``self.log_hyperparams()`` or ``self.save_hyperparameters()``) are logged to YAML format.

-    Logs are saved to ``os.path.join(save_dir, name, version)``.
+    Logs are saved to ``os.path.join(save_dir, name, version)``. The CSV file is named ``metrics.csv``
+    and the YAML file is named ``hparams.yaml``.
+
+    This logger supports logging to remote filesystems via ``fsspec``. Make sure you have it installed.

     Example:
-        >>> from lightning.pytorch import Trainer
-        >>> from lightning.pytorch.loggers import CSVLogger
-        >>> logger = CSVLogger("logs", name="my_exp_name")
-        >>> trainer = Trainer(logger=logger)
+
+    .. code-block:: python
+
+        from lightning.pytorch import Trainer
+        from lightning.pytorch.loggers import CSVLogger
+
+        # Basic usage
+        logger = CSVLogger("logs", name="my_exp_name")
+        trainer = Trainer(logger=logger)
+
+    Use the logger anywhere in your :class:`~lightning.pytorch.core.LightningModule` as follows:
+
+    .. code-block:: python
+
+        import torch
+        from lightning.pytorch import LightningModule
+
+        class LitModel(LightningModule):
+            def __init__(self, learning_rate=0.001, batch_size=32):
+                super().__init__()
+                # This will log hyperparameters to hparams.yaml
+                self.save_hyperparameters()
+
+            def training_step(self, batch, batch_idx):
+                loss = self.compute_loss(batch)
+                # This will log metrics to metrics.csv
+                self.log("train_loss", loss)
+                return loss
+
+            def configure_optimizers(self):
+                return torch.optim.Adam(self.parameters(), lr=self.hparams.learning_rate)
+
+    You can also manually log hyperparameters:
+
+    .. code-block:: python
+
+        # Log additional hyperparameters manually
+        logger.log_hyperparams({"dropout": 0.2, "optimizer": "adam"})
+
+    **File Structure:**
+
+    The logger creates the following files in the log directory:
+
+    - ``metrics.csv``: Contains metrics from ``self.log()`` calls
+    - ``hparams.yaml``: Contains hyperparameters from ``self.save_hyperparameters()`` or ``self.log_hyperparams()``

     Args:
         save_dir: Save directory

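To check where these files land and to read the metrics back after a run, the logger's ``log_dir`` property gives the resolved ``os.path.join(save_dir, name, version)`` path. A minimal sketch, not part of this commit — the ``version_0`` suffix assumes a first run with default versioning, and pandas is just one convenient CSV reader:

```python
import pandas as pd

from lightning.pytorch.loggers import CSVLogger

# Matches the basic-usage example in the docstring above.
logger = CSVLogger("logs", name="my_exp_name")
print(logger.log_dir)  # e.g. logs/my_exp_name/version_0

# After training, metrics.csv holds one row per logging step; the exact
# columns depend on which keys were passed to self.log().
metrics = pd.read_csv(f"{logger.log_dir}/metrics.csv")
print(metrics.tail())
```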
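The remote-filesystem note in the docstring can be exercised by passing an ``fsspec``-style URL as ``save_dir``. A hedged sketch — the bucket name is hypothetical, and it assumes the matching ``fsspec`` backend (here ``s3fs``) is installed:

```python
from lightning.pytorch.loggers import CSVLogger

# Hypothetical S3 bucket; the s3:// scheme requires `pip install s3fs`.
logger = CSVLogger("s3://my-bucket/lightning-logs", name="my_exp_name")
```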