This is the official companion repository to the book GANs in Action: Deep Learning with Generative Adversarial Networks by Jakub Langr and Vladimir Bok, published by Manning Publications.
This repo lets you reproduce, study, and extend every hands-on example from the book. The notebooks walk through every major variant in the GAN family, from the original vanilla GAN to CycleGAN, using Keras/TensorFlow.
- Overview
- Repository Structure
- Canonical GAN Papers
- Getting Started
- Chapter Implementations
- Educational Resources
- Best Practices
- Community Resources
- Citation
- License
This repository contains practical implementations of various Generative Adversarial Network architectures discussed in the book "GANs in Action". Each chapter includes Jupyter notebooks with fully functional code examples that demonstrate key concepts and techniques in GAN development.
- Fundamental concepts of generative modeling and adversarial training
- Implementation of various GAN architectures from scratch
- Best practices for training stable GANs
- Real-world applications of GANs
- Advanced techniques for improving GAN performance
gans-in-action/
├── chapter-2/         # Autoencoders
├── chapter-3/         # Vanilla GAN
├── chapter-4/         # Deep Convolutional GAN (DCGAN)
├── chapter-6/         # Progressive GAN
├── chapter-7/         # Semi-Supervised GAN
├── chapter-8/         # Conditional GAN
├── chapter-9/         # CycleGAN
├── chapter-10/        # Adversarial examples
└── requirements.txt   # Python dependencies
Each implementation in this repository is based on groundbreaking research. Here are the canonical papers for each GAN architecture covered:
Paper: Generative Adversarial Networks
Authors: Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio
Year: 2014
Key Contribution: Introduced the foundational GAN framework with adversarial training between generator and discriminator networks.
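The adversarial framework from the paper can be summarized as a two-player minimax game between the generator G and the discriminator D:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log D(x)]
  + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]
```

The discriminator maximizes V by telling real samples from generated ones; the generator minimizes V by producing samples the discriminator cannot reject.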
Paper: Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks
Authors: Alec Radford, Luke Metz, Soumith Chintala
Year: 2015
Key Contribution: Established architectural guidelines for stable GAN training using convolutional networks.
Paper: Progressive Growing of GANs for Improved Quality, Stability, and Variation
Authors: Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen
Year: 2017
Key Contribution: Introduced progressive training methodology for generating high-resolution images.
Paper: Semi-Supervised Learning with Generative Adversarial Networks
Author: Augustus Odena
Year: 2016
Key Contribution: Extended GANs for semi-supervised learning by modifying the discriminator to output class labels.
Paper: Conditional Generative Adversarial Nets
Authors: Mehdi Mirza, Simon Osindero
Year: 2014
Key Contribution: Enabled conditional generation by incorporating label information into both generator and discriminator.
Paper: Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks
Authors: Jun-Yan Zhu, Taesung Park, Phillip Isola, Alexei A. Efros
Year: 2017
Key Contribution: Enabled image-to-image translation without paired training data using cycle consistency loss.
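The cycle consistency loss at the heart of the paper penalizes round trips that fail to reconstruct the input, for generators G: X→Y and F: Y→X:

```latex
\mathcal{L}_{\text{cyc}}(G, F) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\lVert F(G(x)) - x \rVert_1\big]
  + \mathbb{E}_{y \sim p_{\text{data}}(y)}\big[\lVert G(F(y)) - y \rVert_1\big]
```

This constraint is what allows training without paired examples: each domain supervises the other through reconstruction.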
- Python 3.6 or higher
- CUDA-capable GPU (recommended for faster training)
- 8GB+ RAM
- Clone this repository:
git clone https://github.com/GANs-in-Action/gans-in-action.git
cd gans-in-action
- Create a virtual environment:
python -m venv gan_env
source gan_env/bin/activate # On Windows: gan_env\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
Navigate to any chapter directory and launch Jupyter Notebook:
cd chapter-3
jupyter notebook
Open the notebook file (e.g., Chapter_3_GAN.ipynb) and run the cells sequentially.
- Implementation: Basic GAN for MNIST digit generation
- Key Concepts: Generator and discriminator networks, adversarial loss, training dynamics
- Dataset: MNIST handwritten digits
- Implementation: DCGAN for generating realistic images
- Key Concepts: Convolutional architectures, batch normalization, architectural guidelines
- Dataset: MNIST, CelebA (optional)
- Implementation: Various training techniques and solutions
- Key Concepts: Mode collapse, vanishing gradients, training stability
- Techniques: Label smoothing, feature matching, minibatch discrimination
- Implementation: Progressive growing for high-resolution generation
- Key Concepts: Progressive training, smooth fade-in, minibatch standard deviation
- Dataset: CelebA-HQ
- Implementation: SGAN for improved classification with limited labels
- Key Concepts: Semi-supervised learning, modified discriminator architecture
- Dataset: MNIST with limited labels
- Implementation: CGAN for controlled generation
- Key Concepts: Conditional generation, label embedding, targeted synthesis
- Dataset: MNIST with class conditions
- Implementation: Unpaired image-to-image translation
- Key Concepts: Cycle consistency loss, unpaired translation, domain adaptation
- Dataset: Horse2Zebra, Apple2Orange
- NIPS 2016 Tutorial: Generative Adversarial Networks by Ian Goodfellow
- MIT Deep Learning Course - Includes comprehensive GAN coverage
- Stanford CS231n - Convolutional Neural Networks for Visual Recognition
- Fast.ai Practical Deep Learning - Practical approach to deep learning including GANs
- Ian Goodfellow: Generative Adversarial Networks (NIPS 2016)
- Two Minute Papers: GAN Series - Accessible explanations of latest GAN research
- Lex Fridman Podcast with Ian Goodfellow
- Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
- Chapter 20: Deep Generative Models
- Pattern Recognition and Machine Learning by Christopher Bishop
- Generative Deep Learning by David Foster
- Improved Techniques for Training GANs (2016) - Salimans et al.
- Wasserstein GAN (2017) - Arjovsky et al.
- Spectral Normalization for GANs (2018) - Miyato et al.
- Self-Attention Generative Adversarial Networks (2018) - Zhang et al.
- StyleGAN (2018) - Karras et al.
- GAN Lab - Interactive visualization of GAN training
- TensorFlow GAN Playground - Hands-on DCGAN tutorial
- PyTorch GAN Zoo - Collection of GAN implementations
- Normalize inputs to the [-1, 1] range
- Use different learning rates for the generator and discriminator (typically D_lr > G_lr)
- Monitor training metrics carefully - D_loss and G_loss should stay balanced
- Use gradient penalties (WGAN-GP) for improved stability
- Apply spectral normalization in the discriminator to enforce a Lipschitz constraint
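A related stabilization trick worth knowing is one-sided label smoothing (from Salimans et al., listed under advanced papers below): replacing the hard real-image target 1.0 with 0.9. This tiny NumPy sketch (illustrative values, not from the book's notebooks) shows why it keeps the discriminator from saturating:

```python
import numpy as np

def bce(targets, preds, eps=1e-7):
    """Binary cross-entropy, averaged over the batch."""
    preds = np.clip(preds, eps, 1 - eps)
    return -np.mean(targets * np.log(preds) + (1 - targets) * np.log(1 - preds))

# Discriminator outputs on a batch of real images (illustrative values).
d_real = np.full(64, 0.95)

hard_loss = bce(np.ones(64), d_real)         # hard targets = 1.0
smooth_loss = bce(np.full(64, 0.9), d_real)  # one-sided smoothing: targets = 0.9

# With smoothed targets, confidently predicting ~1.0 is no longer optimal,
# so an overconfident discriminator is penalized instead of rewarded.
```

Only the real-image targets are smoothed; fake targets stay at 0.0, hence "one-sided."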
- Replace pooling with strided convolutions (discriminator) and fractional-strided convolutions (generator)
- Use BatchNorm in the generator and LayerNorm/InstanceNorm in the discriminator
- Use LeakyReLU (0.2 slope) in the discriminator and ReLU in the generator
- Avoid fully connected layers in deeper architectures
- Mode collapse - Generator produces limited variety
- Vanishing gradients - Discriminator becomes too strong
- Oscillating losses - Unstable training dynamics
- Memory issues - Use gradient checkpointing for large models
- Inception Score (IS) - Measures quality and diversity
- Fréchet Inception Distance (FID) - Compares feature distributions
- Precision and Recall - Measures quality vs diversity trade-off
- Human evaluation - Still the gold standard for many applications
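For reference, FID fits a Gaussian to the Inception-v3 features of the real and generated images and computes the Fréchet distance ||μ1 − μ2||² + Tr(Σ1 + Σ2 − 2(Σ1Σ2)^{1/2}) between them. A minimal NumPy/SciPy sketch of just the distance (the feature-extraction step is omitted):

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Frechet distance between Gaussians N(mu1, sigma1) and N(mu2, sigma2)."""
    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    covmean = covmean.real  # discard tiny imaginary parts from numerical noise
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

# In a real evaluation, mu/sigma are the mean and covariance of Inception-v3
# features over real and generated images; identity covariances shown here.
mu_real, mu_fake = np.zeros(2), np.array([3.0, 4.0])
fid = frechet_distance(mu_real, np.eye(2), mu_fake, np.eye(2))  # -> 25.0
```

Lower is better; identical distributions give a distance of 0.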
- r/MachineLearning - Active discussions on latest GAN research
- GAN Discord Server - Community of GAN researchers and practitioners
- Stack Overflow - GAN Tag - Technical Q&A
- TensorFlow GAN (TF-GAN) - TensorFlow GAN library
- PyTorch-GAN - Collection of PyTorch implementations
- Keras-GAN - Keras implementations
- CelebA - Celebrity faces dataset
- LSUN - Large-scale scene understanding
- FFHQ - Flickr-Faces-HQ dataset
- ImageNet - Large-scale image database
If you use this code in your research, please cite the book:
@book{langr2019gans,
title={GANs in Action: Deep Learning with Generative Adversarial Networks},
author={Langr, Jakub and Bok, Vladimir},
year={2019},
publisher={Manning Publications}
}
This code is provided under the MIT License. See the LICENSE file for details.
We welcome contributions to improve this repository! If you find any issues, have suggestions for improvements, or want to add more comprehensive examples, please feel free to open an issue or submit a pull request.
- Book Website: https://www.manning.com/books/gans-in-action
- Authors: Jakub Langr & Vladimir Bok
- Issues: Please use the GitHub issue tracker
Made with ❤️ by the GANs in Action team