
Commit fa26d3d

Modify Skep Docstring (#1139)
* modify transformer-rst
* modify roformer tokenizer
* delete modifications
* modify skepmodel
* modify skep tokenizer
* update
* modify skeptokenizer
* modify bart
* update
* update
* modify bigbird tokenizer
* modify bigbirdmodel
* modify skepmodel
* fix errors
1 parent a6e886b commit fa26d3d

File tree

5 files changed: +519 additions, -333 deletions


paddlenlp/transformers/bart/modeling.py

1 addition, 4 deletions

@@ -176,10 +176,7 @@ def __init__(self,
         self.encoder = nn.TransformerEncoder(encoder_layer, num_encoder_layers)
         self.apply(self.init_weights)

-    def forward(self,
-                input_ids=None,
-                attention_mask=None):
+    def forward(self, input_ids=None, attention_mask=None):
         if input_ids is None:
             raise ValueError("Input_ids cannot be None.")
         inputs_embeds = self.embed_tokens(input_ids)
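The diff above collapses a multi-line `forward` signature into a single line and keeps the guard that rejects a missing `input_ids` before embedding. A minimal sketch of that behavior, using a toy stand-in class (the `ToyEncoder` name and its dict-based `embed_tokens` are illustrative assumptions, not PaddleNLP's real `BartEncoder` internals):

```python
class ToyEncoder:
    """Toy stand-in mirroring the simplified forward signature.

    The real BartEncoder embeds input_ids with a learned embedding
    table and runs them through a TransformerEncoder; this sketch
    replaces both with a plain dict lookup.
    """

    def __init__(self, vocab):
        # Hypothetical token-id -> vector table standing in for embed_tokens.
        self.vocab = vocab

    def embed_tokens(self, input_ids):
        return [self.vocab[i] for i in input_ids]

    def forward(self, input_ids=None, attention_mask=None):
        # Same guard as the diff: fail fast on a missing input
        # instead of crashing later inside the embedding lookup.
        if input_ids is None:
            raise ValueError("Input_ids cannot be None.")
        return self.embed_tokens(input_ids)


encoder = ToyEncoder({0: [0.0], 1: [1.0]})
print(encoder.forward([1, 0]))  # [[1.0], [0.0]]
```

Calling `encoder.forward()` with no arguments raises the same `ValueError` shown in the patched source.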
