Commit 243aeb7

Fix Gradient Checkpointing for Deberta & Deberta-V2 using PEFT / Adapters (#35898)
Replace In-Place Operations for Deberta and Deberta-V2
1 parent 8a2f062 commit 243aeb7

File tree

2 files changed

+4
-4
lines changed


src/transformers/models/deberta/modeling_deberta.py

Lines changed: 2 additions & 2 deletions

@@ -418,10 +418,10 @@ def forward(self, input_ids=None, token_type_ids=None, position_ids=None, mask=N

         embeddings = inputs_embeds
         if self.position_biased_input:
-            embeddings += position_embeddings
+            embeddings = embeddings + position_embeddings
         if self.token_type_embeddings is not None:
             token_type_embeddings = self.token_type_embeddings(token_type_ids)
-            embeddings += token_type_embeddings
+            embeddings = embeddings + token_type_embeddings

         if self.embed_proj is not None:
             embeddings = self.embed_proj(embeddings)

src/transformers/models/deberta_v2/modeling_deberta_v2.py

Lines changed: 2 additions & 2 deletions

@@ -551,10 +551,10 @@ def forward(self, input_ids=None, token_type_ids=None, position_ids=None, mask=N

         embeddings = inputs_embeds
         if self.position_biased_input:
-            embeddings += position_embeddings
+            embeddings = embeddings + position_embeddings
         if self.token_type_embeddings is not None:
             token_type_embeddings = self.token_type_embeddings(token_type_ids)
-            embeddings += token_type_embeddings
+            embeddings = embeddings + token_type_embeddings

         if self.embed_proj is not None:
             embeddings = self.embed_proj(embeddings)
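The change looks cosmetic, but it matters for autograd: with PEFT/adapters on a frozen base model plus gradient checkpointing, the embedding output can end up as a tensor that requires grad, and PyTorch forbids in-place mutation of a leaf tensor that requires grad. A minimal standalone sketch of the failure mode (not the actual model code; the tensors here are stand-ins for `embeddings` and `position_embeddings`):

```python
import torch

# Stand-in for the embedding output in a checkpointed forward pass where
# gradients must flow (e.g. PEFT adapters on a frozen base model).
# A leaf tensor with requires_grad=True cannot be modified in place.
embeddings = torch.ones(4, requires_grad=True)
position_embeddings = torch.full((4,), 0.5)

try:
    embeddings += position_embeddings  # old code: in-place add -> RuntimeError
except RuntimeError as err:
    print(f"in-place add fails: {err}")

# New code: out-of-place add allocates a fresh tensor and records the op in
# the autograd graph, so checkpointed re-computation can backpropagate.
embeddings = embeddings + position_embeddings
print(embeddings.requires_grad)  # True
```

The out-of-place form is marginally less memory-efficient (it allocates a new tensor), but it keeps the original leaf intact for autograd, which is what the checkpointed re-run of `forward` needs.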
