This repository contains a collection of Jupyter notebooks exploring neural network architectures for Natural Language Processing (NLP).
It was developed as part of coursework and self-study to practice fundamental and advanced concepts in deep learning for text.
- `word_embeddings.ipynb`: Introduction to word embeddings (Word2Vec, GloVe, embedding matrix analysis).
- `lstm.ipynb`: Language modeling with LSTM networks.
- `gru.ipynb`: Sequence modeling using GRU networks.
- `language_modelling_lstm.ipynb`: Hands-on example of language modeling with LSTM.
- `bidirectional_rnn.ipynb`: Implementation of bidirectional RNNs and analysis of embedding matrix evolution.
- `data/`: Dataset(s) and preprocessed files used for experiments.
- `README.md`: Project documentation (this file).
- Build and train RNN-based models for sequence and language modeling tasks.
- Compare different architectures: RNN, LSTM, GRU, Bidirectional RNN.
- Explore word embeddings and their evolution during training.
- Provide educational examples for learners interested in deep learning and NLP.
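To give a flavor of the models these notebooks build, below is a minimal sketch of an LSTM language model: token IDs are embedded, passed through an LSTM, and the hidden states are projected to vocabulary logits. This sketch assumes PyTorch; the notebooks may use a different framework, and the class and parameter names here are illustrative, not taken from the notebooks.

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Minimal LSTM language model: embed tokens, run an LSTM, project to vocab."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer token IDs
        embedded = self.embedding(tokens)    # (batch, seq_len, embed_dim)
        outputs, _ = self.lstm(embedded)     # (batch, seq_len, hidden_dim)
        return self.fc(outputs)              # (batch, seq_len, vocab_size) logits

vocab_size = 100
model = LSTMLanguageModel(vocab_size)
batch = torch.randint(0, vocab_size, (2, 10))  # 2 sequences of 10 tokens
logits = model(batch)
print(logits.shape)  # torch.Size([2, 10, 100])
```

Swapping `nn.LSTM` for `nn.GRU`, or setting `bidirectional=True` (and doubling the input size of the final linear layer), yields the GRU and bidirectional variants covered in the other notebooks.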