franrolotti/neural-networks-for-NLP

# Neural Networks for NLP

This repository contains a collection of Jupyter notebooks exploring neural network architectures for Natural Language Processing (NLP).
It was developed as part of coursework and self-study to practice fundamental and advanced concepts in deep learning for text.


## 📂 Repository Structure

- `word_embeddings.ipynb` – Introduction to word embeddings (Word2Vec, GloVe, embedding matrix analysis).
- `lstm.ipynb` – Language modeling with LSTM networks.
- `gru.ipynb` – Sequence modeling using GRU networks.
- `language_modelling_lstm.ipynb` – Hands-on example of language modeling with LSTM.
- `bidirectional_rnn.ipynb` – Implementation of bidirectional RNNs and analysis of embedding matrix evolution.
- `data/` – Dataset(s) and preprocessed files used for experiments.
- `README.md` – Project documentation (this file).
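As a flavor of what the embedding notebook covers, here is a minimal NumPy sketch of comparing word vectors with cosine similarity. The vocabulary and vector values are toy illustrations, not trained Word2Vec or GloVe embeddings:

```python
import numpy as np

# Toy embedding matrix: one row per vocabulary word (values are illustrative,
# not learned). Real notebooks would load pretrained Word2Vec/GloVe vectors.
vocab = ["king", "queen", "apple"]
embeddings = np.array([
    [0.90, 0.80, 0.10],   # "king"
    [0.85, 0.82, 0.12],   # "queen"
    [0.10, 0.05, 0.90],   # "apple"
])

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors, in [-1, 1]."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_king_queen = cosine_similarity(embeddings[0], embeddings[1])
sim_king_apple = cosine_similarity(embeddings[0], embeddings[2])
# In this toy space, "king" is closer to "queen" than to "apple".
```

Comparing such similarities before and after training is one simple way to inspect how an embedding matrix evolves.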

## 🚀 Features & Goals

- Build and train RNN-based models for sequence and language modeling tasks.
- Compare different architectures: RNN, LSTM, GRU, Bidirectional RNN.
- Explore word embeddings and their evolution during training.
- Provide educational examples for learners interested in deep learning and NLP.
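To illustrate the kind of recurrent update these architectures share, here is a minimal NumPy sketch of a single GRU step (the update gate `z`, reset gate `r`, and candidate state). Weights are random and untrained; the notebooks themselves would use a deep learning framework:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: gate the old hidden state against a candidate state."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde         # interpolate old and candidate

input_dim, hidden_dim = 4, 3
# Six small random weight matrices: (Wz, Uz, Wr, Ur, Wh, Uh).
params = [rng.standard_normal((hidden_dim, d)) * 0.1
          for d in (input_dim, hidden_dim) * 3]

h = np.zeros(hidden_dim)
for _ in range(5):                             # run a short random sequence
    h = gru_cell(rng.standard_normal(input_dim), h, params)
```

An LSTM differs mainly in carrying a separate cell state with input, forget, and output gates; comparing the two is one of the goals above.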
