Generative Pretrained Transformer (GPT)

This repo contains a minimal PyTorch implementation of the GPT model (a decoder-only transformer), trained on the Tiny Shakespeare dataset.
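Tiny Shakespeare is usually handled at the character level. The sketch below shows a typical character-level encode/decode setup for such a dataset; the names (`stoi`, `itos`, `encode`, `decode`) are illustrative and not necessarily the repo's exact code.

```python
# Hypothetical character-level tokenizer, as commonly used with Tiny Shakespeare.
text = "First Citizen: Before we proceed any further, hear me speak."

chars = sorted(set(text))                      # vocabulary = unique characters
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
itos = {i: ch for ch, i in stoi.items()}       # integer id -> char

def encode(s):
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of integer token ids back to a string."""
    return "".join(itos[i] for i in ids)

ids = encode("hear me")
assert decode(ids) == "hear me"   # round trip recovers the original text
```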

File Structure

├── bigram.py   --> bigram language model
├── gpt.py      --> transformer decoder-only model
└── input.txt   --> Shakespeare training data
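The bigram model in `bigram.py` is the simplest baseline: each token predicts the next one directly from an embedding table, with no context beyond the current character. A minimal sketch of such a model (illustrative names and vocabulary size, not necessarily the repo's exact code) might look like:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    """Each token id indexes a row of logits over the next token."""

    def __init__(self, vocab_size):
        super().__init__()
        # the embedding table doubles as the output head: (vocab, vocab)
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx):
        # (B, T) token ids -> (B, T, vocab_size) next-token logits
        return self.token_embedding(idx)

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits = self(idx)[:, -1, :]         # logits for the last position
            probs = F.softmax(logits, dim=-1)
            nxt = torch.multinomial(probs, 1)    # sample one next token
            idx = torch.cat([idx, nxt], dim=1)   # append and continue
        return idx

model = BigramLanguageModel(vocab_size=65)       # 65 = typical Tiny Shakespeare charset size
out = model.generate(torch.zeros(1, 1, dtype=torch.long), max_new_tokens=8)
```

Because the prediction depends only on the current token, this model produces near-gibberish, which is what motivates the transformer in `gpt.py`.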

Transformer Architecture

(Transformer architecture diagram)
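The core of the decoder-only architecture is masked (causal) self-attention: each position attends only to earlier positions. A sketch of a single attention head in PyTorch, with illustrative hyperparameters that need not match `gpt.py`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Head(nn.Module):
    """One head of masked (causal) self-attention, as in a GPT decoder block."""

    def __init__(self, n_embd, head_size, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # lower-triangular mask: position t may attend only to positions <= t
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5       # scaled dot-product
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)                              # attention weights
        return wei @ v                                            # (B, T, head_size)

head = Head(n_embd=32, head_size=16, block_size=8)
y = head(torch.randn(2, 8, 32))
```

A full block stacks several such heads (multi-head attention), a feed-forward layer, residual connections, and layer norm; the model then stacks multiple blocks on top of token and positional embeddings.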
