These are side projects and experiments I have done in my spare time. The notebooks are documented so that they can be used for educational purposes and to deepen the understanding of selected machine learning topics. Currently, the following notebooks exist:
- Encoder-Decoder Transformer model for converting a numerical representation of a number into a textual representation (01_encoder-decoder)
- Decoder-only Transformer model for letter concatenation (02_decoder-only)
Soon to come:
- Encoder-only Transformer model for spam classification with dot-product self-attention and Gaussian adaptive self-attention
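
All of the notebooks above build on the same core operation, scaled dot-product self-attention. As a point of reference, here is a minimal NumPy sketch of that operation; it is an illustration only and does not reflect the actual implementation in the notebooks (the function name and weight shapes are assumptions for this example):

```python
import numpy as np

def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (tokens, dim)."""
    # Project the input into queries, keys, and values.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    # Attention scores: similarity of every query to every key, scaled by sqrt(d_k).
    scores = q @ k.T / np.sqrt(d_k)
    # Numerically stable softmax over the key axis (each row sums to 1).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is an attention-weighted mixture of the values.
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))    # 4 tokens, embedding dimension 8
w_q = rng.normal(size=(8, 8))  # toy projection matrices
w_k = rng.normal(size=(8, 8))
w_v = rng.normal(size=(8, 8))
out = scaled_dot_product_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one output vector per input token
```

The decoder-only variant additionally masks the scores so each token can only attend to earlier positions, and the Gaussian adaptive variant replaces the dot-product scores with a distance-based kernel.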