
Commit 2a596f5

pstjohn and SunMarc authored
[ESM] add accepts_loss_kwargs=False to EsmPreTrainedModel (#41006)
add accepts_loss_kwargs=False to EsmPreTrainedModel

Signed-off-by: Peter St. John <[email protected]>
Co-authored-by: Marc Sun <[email protected]>
1 parent: 3edd804 · commit: 2a596f5

File tree

1 file changed: 1 addition, 0 deletions


src/transformers/models/esm/modeling_esm.py

Lines changed: 1 addition & 0 deletions
@@ -589,6 +589,7 @@ class EsmPreTrainedModel(PreTrainedModel):
     config: EsmConfig
     base_model_prefix = "esm"
     supports_gradient_checkpointing = True
+    accepts_loss_kwargs = False
     _no_split_modules = ["EsmLayer", "EsmFoldTriangularSelfAttentionBlock", "EsmEmbeddings"]
     _keys_to_ignore_on_load_unexpected = ["position_embeddings.weight"]
     _supports_flash_attn = True
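
For context: `accepts_loss_kwargs` is a class-level flag that trainer code can consult to decide whether loss-related keyword arguments such as `num_items_in_batch` may be forwarded into a model's `forward`. Setting it to `False` on `EsmPreTrainedModel` opts all ESM models out of that path, consistent with `forward` signatures that do not take arbitrary loss kwargs. The sketch below is illustrative only: it assumes a trainer-style loop that checks the attribute before forwarding the kwarg, and `compute_loss` is a hypothetical helper, not the actual `Trainer` implementation.

# Illustrative sketch: how a trainer-style loop might consult the
# accepts_loss_kwargs flag before forwarding extra loss kwargs to forward().
# compute_loss is a hypothetical helper, not the actual Trainer code.
import torch
from transformers import EsmForMaskedLM

def compute_loss(model, inputs, num_items_in_batch=None):
    loss_kwargs = {}
    # EsmPreTrainedModel now sets accepts_loss_kwargs = False, so ESM models
    # take the plain path and never receive num_items_in_batch in forward().
    if getattr(model, "accepts_loss_kwargs", False) and num_items_in_batch is not None:
        loss_kwargs["num_items_in_batch"] = num_items_in_batch
    outputs = model(**inputs, **loss_kwargs)
    return outputs.loss

model = EsmForMaskedLM.from_pretrained("facebook/esm2_t6_8M_UR50D")
inputs = {
    "input_ids": torch.tensor([[0, 5, 6, 7, 2]]),  # toy ids: <cls>, three amino-acid tokens, <eos>
    "labels": torch.tensor([[0, 5, 6, 7, 2]]),
}
print(compute_loss(model, inputs, num_items_in_batch=4))

With the flag set to `False`, the call above behaves identically to `compute_loss(model, inputs)`: the branch that injects `num_items_in_batch` is skipped, whereas a model advertising `accepts_loss_kwargs = True` would receive the extra kwarg (typically used to normalize the loss under gradient accumulation).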
