docs/insights.rst
+22 lines changed: 22 additions & 0 deletions
@@ -117,3 +117,25 @@ Example:
mask.shape, label.shape
# (N, 4, H, W), (N, 4)

4. Freezing and unfreezing the encoder
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Sometimes you may want to freeze the encoder during training, e.g. when using pretrained backbones and only fine-tuning the decoder and segmentation head.

All segmentation models in SMP provide two helper methods:

.. code-block:: python

    # Freeze encoder: disables gradient updates for encoder parameters
    # and puts its normalization layers in eval mode
    model.freeze_encoder()

    # Unfreeze encoder: re-enables training for encoder parameters and normalization layers
    model.unfreeze_encoder()

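For example, a two-stage fine-tuning schedule might look like the sketch below. The ``Unet``/``resnet34`` model, the Adam optimizer, and the learning rates are illustrative choices for the example, not requirements of the API:

.. code-block:: python

    import torch
    import segmentation_models_pytorch as smp

    model = smp.Unet(encoder_name="resnet34", encoder_weights="imagenet", classes=4)

    # Stage 1: freeze the encoder and train only the decoder and segmentation head.
    model.freeze_encoder()
    optimizer = torch.optim.Adam(
        [p for p in model.parameters() if p.requires_grad], lr=1e-3
    )

    # ... train for a few warm-up epochs ...

    # Stage 2: unfreeze the encoder and fine-tune the whole model at a lower learning rate.
    model.unfreeze_encoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
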
.. important::

   - Freezing sets ``requires_grad = False`` for all encoder parameters.
   - Normalization layers that track running statistics (e.g., BatchNorm and InstanceNorm layers) are set to ``.eval()`` mode to prevent updates to ``running_mean`` and ``running_var``.
   - If you later call ``model.train()``, frozen encoders will remain frozen until you call ``unfreeze_encoder()``.

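As a quick sanity check of the behavior described above, the sketch below (assuming a BatchNorm-based encoder such as ``resnet34``) verifies that a frozen encoder stays excluded from gradient updates and keeps its normalization layers in eval mode even after ``model.train()``:

.. code-block:: python

    import torch
    import segmentation_models_pytorch as smp

    model = smp.Unet(encoder_name="resnet34", encoder_weights=None, classes=4)
    model.freeze_encoder()
    model.train()  # the frozen encoder is not switched back to training mode

    # Encoder parameters are excluded from gradient computation ...
    assert all(not p.requires_grad for p in model.encoder.parameters())

    # ... and its BatchNorm layers keep using their running statistics.
    assert all(
        not m.training
        for m in model.encoder.modules()
        if isinstance(m, torch.nn.BatchNorm2d)
    )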