Commit a12354b

Author: sfluegel
Commit message: increase electra vocab size
Parent: 272446d

File tree

2 files changed: +2 −2 lines


chebai/models/electra.py

Lines changed: 1 addition & 1 deletion
@@ -329,7 +329,7 @@ def forward(self, data: Dict[str, Tensor], **kwargs: Any) -> Dict[str, Any]:
         except RuntimeError as e:
             print(f"RuntimeError at forward: {e}")
             print(f'data[features]: {data["features"]}')
-            raise Exception
+            raise e
         inp = self.word_dropout(inp)
         electra = self.electra(inputs_embeds=inp, **kwargs)
         d = electra.last_hidden_state[:, 0, :]
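
Side note on the Python change: a bare `raise Exception` discards the caught RuntimeError, so callers lose its type and message, whereas `raise e` re-raises the original exception object with its traceback attached. A minimal, self-contained sketch of the difference (not code from this repository; the error message is a hypothetical placeholder):

def failing_op():
    # Hypothetical failure standing in for the embedding lookup error.
    raise RuntimeError("index out of range in self")

# Old behavior: `raise Exception` replaces the original error entirely.
try:
    try:
        failing_op()
    except RuntimeError as e:
        raise Exception
except Exception as err:
    print(type(err).__name__, err)  # -> "Exception" with an empty message

# New behavior: `raise e` re-raises the caught RuntimeError unchanged.
try:
    try:
        failing_op()
    except RuntimeError as e:
        raise e
except Exception as err:
    print(type(err).__name__, err)  # -> "RuntimeError index out of range in self"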

configs/model/electra.yml

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ init_args:
   optimizer_kwargs:
     lr: 1e-3
   config:
-    vocab_size: 1400
+    vocab_size: 8500
     max_position_embeddings: 1800
     num_attention_heads: 8
     num_hidden_layers: 6
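
For context, `vocab_size` sets the size of the model's token-embedding table, so it must exceed the largest token id the tokenizer can produce; an id outside that range makes the embedding lookup fail (an IndexError on CPU, typically a RuntimeError on GPU, which matches the error handling touched in the other file). A minimal sketch, under the assumption that the YAML `config` block is forwarded to Hugging Face's ElectraConfig; the token ids below are made up:

import torch
from transformers import ElectraConfig, ElectraModel

# Assumed mapping of the YAML `config` block onto ElectraConfig.
config = ElectraConfig(
    vocab_size=8500,                # raised from 1400 in this commit
    max_position_embeddings=1800,
    num_attention_heads=8,
    num_hidden_layers=6,
)
model = ElectraModel(config)

token_ids = torch.tensor([[3, 1412, 8499]])  # hypothetical ids; 1412 >= 1400
out = model(input_ids=token_ids)             # fine: every id < 8500
print(out.last_hidden_state.shape)           # torch.Size([1, 3, 256])

# With the old vocab_size of 1400, ids 1412 and 8499 would be out of range
# and the embedding lookup would raise instead of returning hidden states.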
