Commit c7914e9

Update XLNet Model (einsum, outputs' format, k_head_r shape) (#408)
* replace with paddlenlp's einsum op
* fix base prefix and shape mismatch
* modify outputs of xlnet model
* unify output in run_glue.py
* fix docs and q_head_h shape

Co-authored-by: Guo Sheng <[email protected]>
1 parent 028f9ac commit c7914e9
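
The commit message notes that XLNet's attention now goes through PaddleNLP's einsum helper instead of hand-written transpose/matmul sequences. Below is a minimal, illustrative sketch of that kind of contraction; the import path (paddlenlp.ops.einsum), the tensor shapes, and the equation are assumptions for illustration, not taken from this commit's diff.

# Illustrative sketch only: import path, shapes, and equation are assumptions,
# not copied from this commit's diff.
import paddle
from paddlenlp.ops import einsum  # assumed location of PaddleNLP's einsum helper

# Toy XLNet-style attention heads, laid out as [seq_len, batch, n_head, d_head].
q_head = paddle.randn([4, 2, 8, 16])
k_head = paddle.randn([4, 2, 8, 16])

# Content-based attention scores as a single einsum contraction,
# yielding [batch, n_head, q_len, k_len].
attn_score = einsum("ibnd,jbnd->bnij", q_head, k_head)
print(attn_score.shape)  # [2, 8, 4, 4]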

2 files changed: +69, -129 lines

examples/language_model/xlnet/run_glue.py

Lines changed: 2 additions & 2 deletions
@@ -86,7 +86,7 @@ def evaluate(model, loss_fct, metric, data_loader):
     global final_res
     for batch in data_loader:
         input_ids, token_type_ids, attention_mask, labels = batch
-        logits = model(input_ids, token_type_ids, attention_mask)[0]
+        logits = model(input_ids, token_type_ids, attention_mask)
         loss = loss_fct(logits, labels)
         losses.append(loss.detach().numpy())
         correct = metric.compute(logits, labels)
@@ -266,7 +266,7 @@ def do_train(args):
         for step, batch in enumerate(train_data_loader):
             global_step += 1
             input_ids, token_type_ids, attention_mask, labels = batch
-            logits = model(input_ids, token_type_ids, attention_mask)[0]
+            logits = model(input_ids, token_type_ids, attention_mask)
             loss = loss_fct(logits, labels)
             loss.backward()
             optimizer.step()
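
Both hunks drop the trailing [0] because, after this change, XLNetForSequenceClassification returns the classification logits tensor directly rather than a tuple that had to be indexed. A minimal usage sketch of the unified output, assuming the "xlnet-base-cased" weights from PaddleNLP's model zoo and a two-class head (both assumptions, not part of this diff):

# Sketch of the unified output format; the model name and num_classes
# below are assumptions for illustration.
import paddle
from paddlenlp.transformers import XLNetForSequenceClassification, XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetForSequenceClassification.from_pretrained("xlnet-base-cased", num_classes=2)
model.eval()

encoded = tokenizer("PaddleNLP makes XLNet easy to use.")
input_ids = paddle.to_tensor([encoded["input_ids"]])
token_type_ids = paddle.to_tensor([encoded["token_type_ids"]])

with paddle.no_grad():
    # The model now returns the logits tensor directly, so no [0] indexing.
    logits = model(input_ids, token_type_ids)
print(logits.shape)  # [1, 2]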
