Ne-x-tr-on/gpt2-from-scratch


GPT-2 From Scratch

This repository provides a minimal, annotated implementation of GPT-2, written from scratch for learning and experimentation. The code is intentionally simple so that readers can follow the inner workings of a transformer-based language model end to end.
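To give a flavour of what "from scratch" means in practice, a single-head causal self-attention step can be written in plain NumPy. This is an illustrative sketch, not code from this repository; all function and variable names are hypothetical:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a sequence x of shape (T, d)."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv           # project inputs to queries/keys/values
    scores = q @ k.T / np.sqrt(d)              # scaled dot-product scores, shape (T, T)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                     # causal mask: no attending to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                         # weighted sum of value vectors

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Note that because of the causal mask, the first token can only attend to itself, so its output is exactly its own value projection.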

Features

  • Pure Python implementation of core GPT-2 components
  • Step-by-step explanations and comments throughout the code
  • Minimal dependencies for easy setup and experimentation
  • Example scripts for training and inference
  • Configurable model parameters
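"Configurable model parameters" in GPT-2 implementations typically amounts to a small config object. A hypothetical sketch (field names are illustrative and may not match this repository's actual API; the default values are those of the released GPT-2 "small" model):

```python
from dataclasses import dataclass

@dataclass
class GPT2Config:
    # Defaults match GPT-2 "small" (117M parameters).
    vocab_size: int = 50257   # BPE vocabulary size
    n_ctx: int = 1024         # maximum context length in tokens
    n_embd: int = 768         # embedding / hidden dimension
    n_layer: int = 12         # number of transformer blocks
    n_head: int = 12          # attention heads per block

# A smaller model for quick experiments: override only what you need.
config = GPT2Config(n_layer=6)
print(config.n_layer, config.n_embd)  # 6 768
```

Using a dataclass keeps all hyperparameters in one place, so swapping model sizes is a one-line change.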

Getting Started

Prerequisites

  • Python 3.7+
  • NumPy
  • PyTorch (optional, for GPU acceleration)

Installation

Clone this repository:

git clone https://github.com/Cohegen/gpt2-from-scratch.git
cd gpt2-from-scratch
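The repository's actual script names are not listed above, but the generation loop at the heart of any GPT-2 inference script follows the same pattern: feed the token sequence to the model, pick the next token, append, repeat. A self-contained greedy-decoding sketch with a stand-in model (all names here are hypothetical):

```python
import numpy as np

def toy_logits(token_ids):
    """Stand-in for a real GPT-2 forward pass: returns logits over a tiny vocab.
    This toy deterministically favours (last_token + 1) mod vocab_size."""
    vocab_size = 5
    logits = np.zeros(vocab_size)
    logits[(token_ids[-1] + 1) % vocab_size] = 1.0
    return logits

def generate(model, prompt_ids, max_new_tokens):
    """Greedy decoding: repeatedly append the argmax token."""
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        next_id = int(np.argmax(model(ids)))  # pick the highest-scoring token
        ids.append(next_id)
    return ids

print(generate(toy_logits, [0], 4))  # [0, 1, 2, 3, 4]
```

With a real model, `toy_logits` would be replaced by a full transformer forward pass (and typically temperature or top-k sampling instead of pure argmax), but the loop itself is unchanged.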
