On padding the input with -1 #4

@hquan7395

Description

The comment at

# pad tokens with -1, which results in a zero vector with embedding look-ups

says that padding with -1 corresponds to a zero vector in the embedding lookup. But when I inspect the trained parameters with

state['params']['encoder']['encoder']['embedding'][-1]

I get

Array([-0.06191076, 0.18668179, -0.25837645, 0.1488393 , 0.11525428, 0.18183647, -0.30500978, 0.07464553, -0.3888737 , 0.10938968, -0.02655256, -0.349307 , -0.03291983, -0.040106 , 0.2369387 , -0.01890478, -0.21027383, -0.00453783, -0.1900245 , 0.01763803, -0.09983198, -0.3064041 , -0.14400026, -0.05854894, 0.10559192, 0.04855605, 0.1330904 , 0.00867153, -0.04014621, 0.23416986, 0.14465654, 0.3036504 ], dtype=float32)

Is this not intended?
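For what it's worth, under NumPy/JAX-style indexing, -1 is not an out-of-bounds index: it wraps around to the last row of the table, so the lookup returns whatever that final row has learned rather than a zero vector. A minimal sketch of both the wrap-around and one common way to actually get zeros for pad positions (masking after the lookup) — the array names here are made up for illustration, not taken from the repo:

```python
import numpy as np

# Hypothetical embedding table: 5 tokens, 4-dim embeddings.
table = np.arange(20, dtype=np.float32).reshape(5, 4)

# Indexing with -1 wraps to the LAST row; it is not a zero vector.
assert np.array_equal(table[-1], table[4])

tokens = np.array([2, 0, -1, -1])      # -1 marks padding
emb = table[tokens]                    # pad positions get table[-1], not zeros

# To make pad positions contribute zero vectors, mask after the lookup.
mask = (tokens >= 0)[:, None].astype(np.float32)
emb_zeroed = emb * mask                # pad rows are now exactly zero
```

If the comment in the code is accurate, the zeroing presumably happens through some mechanism like this masking step (or the last embedding row being reserved), rather than through the raw -1 index itself.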
