Seq2seq model for Neural Machine Translation (NMT) from Spanish to English implemented in Keras
This task is performed using an encoder-decoder LSTM model. In this architecture, the input sequence is encoded by a front-end model called the encoder, and then decoded word by word by a back-end model called the decoder.
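The encoder-decoder wiring described above can be sketched in Keras as follows. The vocabulary sizes and hidden dimension here are illustrative assumptions, not values taken from the original project:

```python
import numpy as np
from tensorflow.keras.layers import LSTM, Dense, Embedding, Input
from tensorflow.keras.models import Model

src_vocab, tgt_vocab = 5000, 4000   # assumed vocabulary sizes
latent_dim = 256                    # assumed LSTM hidden size

# Encoder: reads the Spanish token sequence and keeps only its final
# hidden and cell states as a summary of the sentence.
encoder_inputs = Input(shape=(None,))
enc_emb = Embedding(src_vocab, latent_dim)(encoder_inputs)
_, state_h, state_c = LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: generates English tokens one step at a time, initialized
# with the encoder's final states.
decoder_inputs = Input(shape=(None,))
dec_emb = Embedding(tgt_vocab, latent_dim)(decoder_inputs)
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(dec_emb, initial_state=[state_h, state_c])
outputs = Dense(tgt_vocab, activation="softmax")(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

At training time the decoder input is the target sentence shifted by one token (teacher forcing); at inference time the decoder is run step by step, feeding each predicted word back in as the next input.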
The dataset was selected from the Manythings.org website, which provides a database of translation pairs (from the Tatoeba Project) intended for use with the ANKI flashcard software.
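The ANKI files from Manythings.org are tab-separated text, with one sentence pair per line. A minimal sketch of parsing such a file into (English, Spanish) pairs might look like this (the exact column layout is an assumption about the downloaded file):

```python
def load_pairs(text):
    """Split tab-separated ANKI-style lines into (English, Spanish) pairs.

    Assumes each non-empty line contains at least two tab-separated
    fields: the English sentence first, then the Spanish translation.
    Extra fields (e.g. attribution metadata) are ignored.
    """
    pairs = []
    for line in text.strip().split("\n"):
        parts = line.split("\t")
        if len(parts) >= 2:
            pairs.append((parts[0], parts[1]))
    return pairs
```

Each pair can then be tokenized and integer-encoded separately for the source and target vocabularies before being fed to the model.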