It seems that the word embeddings are kept static during training. How can I make the embeddings trainable, so that they are updated during backpropagation?
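For context, here is a minimal sketch of the kind of setup I mean (assuming PyTorch and `nn.Embedding.from_pretrained`; the vocabulary size, dimension, and pretrained vectors below are placeholders, and my actual setup may differ):

```python
import torch
import torch.nn as nn

# Hypothetical pretrained vectors, e.g. loaded from word2vec/GloVe
pretrained = torch.randn(10000, 300)  # (vocab_size, embedding_dim)

# This appears to be what keeps the embeddings static:
# freeze=True sets weight.requires_grad = False, so backprop never updates them.
embedding = nn.Embedding.from_pretrained(pretrained, freeze=True)
print(embedding.weight.requires_grad)  # False
```

What is the recommended way to make such an embedding layer participate in the gradient updates?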