Releases: Riccorl/transformers-embedder
1.6.2
16 Mar 16:08
to_tensor now converts only the requested fields. Use add_to_tensor_inputs to add custom fields.
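A minimal sketch of what field-selective tensor conversion could look like. The names to_tensor and add_to_tensor_inputs come from the release note, but the registry design, signatures, and default field set below are assumptions, not the library's actual code; a plain Python converter stands in for torch.as_tensor so the sketch has no dependencies.

```python
# Hypothetical sketch (NOT the library's actual implementation):
# a registry so that only requested fields are converted to tensors.
DEFAULT_TENSOR_FIELDS = {"input_ids", "attention_mask"}  # assumed defaults
_custom_tensor_fields = set()

def add_to_tensor_inputs(*names):
    """Register extra batch fields that should also become tensors."""
    _custom_tensor_fields.update(names)

def to_tensor(batch, as_tensor=tuple):
    """Convert only registered fields; `as_tensor` stands in for
    torch.as_tensor so the sketch stays dependency-free."""
    wanted = DEFAULT_TENSOR_FIELDS | _custom_tensor_fields
    return {k: as_tensor(v) if k in wanted else v for k, v in batch.items()}
```

With this shape, unregistered fields (e.g. string metadata) pass through untouched, which is presumably why conversion became opt-in per field.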
1.6.1
16 Mar 15:24
to_tensor is now applied only to valid instances.
1.6
10 Mar 11:26
Made the pad_batch function more general.
Added padding logic for custom fields via add_padding_ops.
The model output can include all the HuggingFace outputs by passing return_all=True to TransformerEmbedder.
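One plausible shape for the add_padding_ops / pad_batch pair described above, assuming a per-field registry of padding callables; the function names follow the release notes, but the signatures and the zero-padding default are illustrative guesses, not the library's API.

```python
# Hypothetical sketch (names from the release notes; real signatures
# may differ): register per-field padding functions, then pad a batch.
_padding_ops = {}

def add_padding_ops(field, pad_fn):
    """Register `pad_fn(values, max_len)` for a custom batch field."""
    _padding_ops[field] = pad_fn

def pad_batch(batch):
    """Pad every field to the longest sequence in the batch.
    Fields without a registered op get a default zero padding."""
    max_len = max(len(seq) for seq in batch["input_ids"])
    padded = {}
    for field, sequences in batch.items():
        pad_fn = _padding_ops.get(
            field, lambda seq, n: seq + [0] * (n - len(seq))
        )
        padded[field] = [pad_fn(seq, max_len) for seq in sequences]
    return padded
```

A registry like this is what makes the padding "more general": custom fields such as labels can use their own fill value (e.g. -100 for loss masking) without special-casing inside pad_batch.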
1.5.1
05 Mar 14:09
Exposed has_double_sep and has_starting_token as Tokenizer properties.
1.4.8
03 Mar 17:16
1.4.7
03 Mar 16:56
Added properties to Tokenizer exposing the model's special tokens.
Exposed num_special_tokens to get the number of special tokens the model requires.
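A sketch of why num_special_tokens is useful: reserving room for special tokens when truncating long inputs. The MockTokenizer below is a hypothetical stand-in for the library's Tokenizer, assuming a BERT-style model that adds two special tokens ([CLS] and [SEP]) around a single sentence.

```python
# Hypothetical sketch of a num_special_tokens use case: reserve room
# for the model's special tokens when truncating long inputs.
class MockTokenizer:
    # Stand-in for the library's Tokenizer; a BERT-style model adds
    # [CLS] and [SEP], so two special tokens for a single sentence.
    num_special_tokens = 2

def truncate(tokens, model_max_length, tokenizer):
    """Keep only as many tokens as fit once special tokens are added."""
    budget = model_max_length - tokenizer.num_special_tokens
    return tokens[:budget]

words = ["tok"] * 600
truncated = truncate(words, 512, MockTokenizer())
# leaves room for [CLS] and [SEP] within the 512-token window
```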
1.4.6
25 Feb 10:24
Exposed save_pretrained in TransformerEmbedder.
1.4.5
23 Feb 14:41
1.4.4
23 Feb 10:33
Fixed a bug with models that use a double SEP token.
1.4.3
23 Feb 08:06
The returned dictionary now includes a word_mask entry, sized to the original sentence length (i.e., words before sub-token splitting).
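A sketch of what a word-level mask means in practice. The helper below is hypothetical (the library does not necessarily build word_mask this way); it assumes a HuggingFace-style word_ids alignment that maps each sub-token to its source word index, with None for special tokens.

```python
# Hypothetical sketch: build a word-level mask (one entry per original
# word, not per sub-token) from sub-token -> word alignments.
def word_mask_from_word_ids(word_ids):
    """`word_ids` maps each sub-token to its source word index, with
    None for special tokens; the mask covers the original words."""
    real = [w for w in word_ids if w is not None]
    n_words = max(real) + 1 if real else 0
    return [1] * n_words

# "unbelievable runs" -> [CLS] un ##believ ##able runs [SEP]:
# six sub-tokens, but the mask stays one-per-word (two words).
ids = [None, 0, 0, 0, 1, None]
mask = word_mask_from_word_ids(ids)
```

A mask shaped like this lets downstream code index word-level embeddings directly, without tracking how many sub-tokens each word produced.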