
Commit 9fccfaa

Fix syntax error and document :last_token_pooling option

- Remove duplicate :scale_attention_weights in transformer block_opts_keys
- Add :last_token_pooling to text embedding documentation

Parent: 660ef1b

File tree: 2 files changed (+3, -1 lines)


lib/bumblebee/layers/transformer.ex

Lines changed: 0 additions & 1 deletion

@@ -57,7 +57,6 @@ defmodule Bumblebee.Layers.Transformer do
       :rotary_embedding,
       :query_norm,
       :key_norm
-      :scale_attention_weights
     ]
 
     opts =
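
The removed line was both a duplicate (per the commit message, :scale_attention_weights already appears in block_opts_keys) and a syntax error, since elements of an Elixir list literal must be comma-separated. A minimal sketch of the fixed list shape, with the surrounding keys elided:

```elixir
# Sketch of the fixed list tail (other keys elided). Before the fix, the
# duplicate :scale_attention_weights entry lacked a preceding comma, which
# is a syntax error in an Elixir list literal, so the module failed to
# compile.
block_opts_keys = [
  :rotary_embedding,
  :query_norm,
  :key_norm
]
```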

lib/bumblebee/text.ex

Lines changed: 3 additions & 0 deletions

@@ -385,6 +385,9 @@ defmodule Bumblebee.Text do
         Note that we currently assume that the CLS token is the first token
         in the sequence
 
+      * `:last_token_pooling` - takes the embedding for the last non-padding
+        token in each sequence
+
       By default no pooling is applied
 
     * `:embedding_processor` - a post-processing step to apply to the
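
For context, a hedged usage sketch of the newly documented pooling option. The model repository and the :output_pool option name are assumptions based on the surrounding Bumblebee.Text.text_embedding/3 docs, not part of this commit:

```elixir
# Minimal sketch: build a text embedding serving that pools by taking the
# embedding of the last non-padding token, then L2-normalizes the result.
# Assumptions: pooling values are passed via :output_pool, and the model
# repo below is only an example.
{:ok, model_info} = Bumblebee.load_model({:hf, "intfloat/e5-mistral-7b-instruct"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "intfloat/e5-mistral-7b-instruct"})

serving =
  Bumblebee.Text.text_embedding(model_info, tokenizer,
    output_pool: :last_token_pooling,
    embedding_processor: :l2_norm
  )

# Returns %{embedding: tensor} with the pooled, normalized embedding.
%{embedding: embedding} = Nx.Serving.run(serving, "Hello, Bumblebee!")
```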
