GANs in Action: Code Companion

Python 3.6+ TensorFlow 1.8.0+ Keras 2.1.6+ License MIT

This is the official companion repository to the book GANs in Action: Deep Learning with Generative Adversarial Networks by Jakub Langr and Vladimir Bok, published by Manning Publications.

This repo lets you reproduce, study, and extend every hands‑on example from the book. The notebooks walk through every major variant in the GAN family, from the original vanilla GAN to CycleGAN, using Keras/TensorFlow.


🎯 Overview

This repository contains practical implementations of various Generative Adversarial Network architectures discussed in the book "GANs in Action". Each chapter includes Jupyter notebooks with fully functional code examples that demonstrate key concepts and techniques in GAN development.

What You'll Learn

  • Fundamental concepts of generative modeling and adversarial training
  • Implementation of various GAN architectures from scratch
  • Best practices for training stable GANs
  • Real-world applications of GANs
  • Advanced techniques for improving GAN performance

📂 Repository Structure

gans-in-action/
├── chapter-2/          # Autoencoders
├── chapter-3/          # Vanilla GAN
├── chapter-4/          # Deep Convolutional GAN (DCGAN)
├── chapter-6/          # Progressive GAN
├── chapter-7/          # Semi-Supervised GAN
├── chapter-8/          # Conditional GAN
├── chapter-9/          # CycleGAN
├── chapter-10/         # Adversarial examples
└── requirements.txt    # Python dependencies

📄 Canonical GAN Papers

Each implementation in this repository is based on groundbreaking research. Here are the canonical papers for each GAN architecture covered:

Original GAN (Chapter 3)

Paper: Generative Adversarial Networks
Authors: Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio
Year: 2014
Key Contribution: Introduced the foundational GAN framework with adversarial training between generator and discriminator networks.

Deep Convolutional GAN - DCGAN (Chapter 4)

Paper: Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks
Authors: Alec Radford, Luke Metz, Soumith Chintala
Year: 2015
Key Contribution: Established architectural guidelines for stable GAN training using convolutional networks.

Progressive GAN (Chapter 6)

Paper: Progressive Growing of GANs for Improved Quality, Stability, and Variation
Authors: Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen
Year: 2017
Key Contribution: Introduced progressive training methodology for generating high-resolution images.

Semi-Supervised GAN (Chapter 7)

Paper: Semi-Supervised Learning with Generative Adversarial Networks
Authors: Augustus Odena
Year: 2016
Key Contribution: Extended GANs for semi-supervised learning by modifying the discriminator to output class labels.

Conditional GAN - CGAN (Chapter 8)

Paper: Conditional Generative Adversarial Nets
Authors: Mehdi Mirza, Simon Osindero
Year: 2014
Key Contribution: Enabled conditional generation by incorporating label information into both generator and discriminator.

CycleGAN (Chapter 9)

Paper: Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks
Authors: Jun-Yan Zhu, Taesung Park, Phillip Isola, Alexei A. Efros
Year: 2017
Key Contribution: Enabled image-to-image translation without paired training data using cycle consistency loss.

🚀 Getting Started

Prerequisites

  • Python 3.6 or higher
  • CUDA-capable GPU (recommended for faster training)
  • 8GB+ RAM

Installation

  1. Clone this repository:
git clone https://github.com/GANs-in-Action/gans-in-action.git
cd gans-in-action
  2. Create a virtual environment:
python -m venv gan_env
source gan_env/bin/activate  # On Windows: gan_env\Scripts\activate
  3. Install dependencies:
pip install -r requirements.txt

Running the Examples

Navigate to any chapter directory and launch Jupyter Notebook:

cd chapter-3
jupyter notebook

Open the notebook file (e.g., Chapter_3_GAN.ipynb) and run the cells sequentially.

📖 Chapter Implementations

Chapter 3: Your First GAN

  • Implementation: Basic GAN for MNIST digit generation
  • Key Concepts: Generator and discriminator networks, adversarial loss, training dynamics
  • Dataset: MNIST handwritten digits
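The adversarial setup in this chapter boils down to two binary cross-entropy objectives pulling in opposite directions. The notebooks implement this in Keras; the standalone NumPy sketch below, with made-up discriminator scores, only illustrates the bookkeeping:

```python
import numpy as np

def bce(predictions, targets, eps=1e-7):
    """Binary cross-entropy, the loss both networks minimize."""
    p = np.clip(predictions, eps, 1 - eps)
    return -np.mean(targets * np.log(p) + (1 - targets) * np.log(1 - p))

# Hypothetical discriminator outputs: probability that an input is real.
d_real = np.array([0.9, 0.8])  # scores on real MNIST digits
d_fake = np.array([0.1, 0.2])  # scores on generated digits

# The discriminator is trained to push real -> 1 and fake -> 0 ...
d_loss = 0.5 * (bce(d_real, np.ones(2)) + bce(d_fake, np.zeros(2)))

# ... while the generator wants its fakes labeled real (-> 1).
g_loss = bce(d_fake, np.ones(2))
```

With the confident discriminator above, `g_loss` dwarfs `d_loss`, which is exactly the imbalance the training-dynamics discussion in the chapter is about.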

Chapter 4: Deep Convolutional GAN (DCGAN)

  • Implementation: DCGAN for generating realistic images
  • Key Concepts: Convolutional architectures, batch normalization, architectural guidelines
  • Dataset: MNIST, CelebA (optional)

Chapter 5: Training and Common Challenges

  • Implementation: Various training techniques and solutions
  • Key Concepts: Mode collapse, vanishing gradients, training stability
  • Techniques: Label smoothing, feature matching, minibatch discrimination
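One-sided label smoothing, the first technique listed above, is a one-line change: real labels become 0.9 instead of 1.0 so the discriminator cannot grow arbitrarily confident. A minimal sketch (the 0.1 amount is the common choice, not a fixed rule):

```python
import numpy as np

def smooth_labels(labels, amount=0.1):
    """One-sided label smoothing: real labels of 1.0 become 0.9,
    discouraging an overconfident discriminator."""
    return labels * (1.0 - amount)

real_labels = np.ones(4)
smoothed = smooth_labels(real_labels)  # array of 0.9s, fed to the D loss
```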

Chapter 6: Progressive GAN

  • Implementation: Progressive growing for high-resolution image generation
  • Key Concepts: Progressive training, smooth fade-in of new layers, minibatch standard deviation
  • Dataset: CelebA-HQ
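The smooth fade-in blends the output of a newly added resolution block with an upsampled copy of the previous stage's output, with a mixing weight alpha that ramps from 0 to 1 over training. A toy NumPy version of just that blend:

```python
import numpy as np

def fade_in(old_output, new_output, alpha):
    """Blend outputs while a new resolution block fades in.
    alpha ramps from 0 (old path only) to 1 (new path only)."""
    return (1.0 - alpha) * old_output + alpha * new_output

old = np.zeros((4, 4))  # stands in for the upsampled previous-stage output
new = np.ones((4, 4))   # stands in for the freshly added block's output
blended = fade_in(old, new, alpha=0.25)  # early in the fade: mostly old path
```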

Chapter 7: Semi-Supervised GAN

  • Implementation: SGAN for improved classification with limited labels
  • Key Concepts: Semi-supervised learning, modified discriminator architecture
  • Dataset: MNIST with limited labels
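The modified discriminator in an SGAN predicts N + 1 classes: the N real classes plus one extra "fake" class. For MNIST that means an 11-way softmax; a sketch with hand-picked logits to show how the outputs split:

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())  # shift for numerical stability
    return e / e.sum()

num_classes = 10                   # digits 0-9
logits = np.zeros(num_classes + 1) # 11th slot is the "fake" class
logits[3] = 5.0                    # strong evidence for digit "3"
probs = softmax(logits)

p_fake = probs[-1]         # probability the input is generated
p_digit = probs[:-1]       # distribution over the real digit classes
```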

Chapter 8: Conditional GAN

  • Implementation: CGAN for controlled generation
  • Key Concepts: Conditional generation, label embedding, targeted synthesis
  • Dataset: MNIST with class conditions
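The simplest form of the conditioning used here is concatenating a one-hot class label onto the generator's noise vector (the book's implementation uses a Keras embedding layer; this NumPy sketch shows only the idea):

```python
import numpy as np

def conditioned_input(noise, label, num_classes=10):
    """Append a one-hot class label to the latent noise vector,
    telling the generator which digit to synthesize."""
    one_hot = np.zeros(num_classes)
    one_hot[label] = 1.0
    return np.concatenate([noise, one_hot])

z = np.random.randn(100)               # 100-dim latent noise
z_cond = conditioned_input(z, label=7) # 110-dim conditioned input
```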

Chapter 9: CycleGAN

  • Implementation: Unpaired image-to-image translation
  • Key Concepts: Cycle consistency loss, unpaired translation, domain adaptation
  • Dataset: Horse2Zebra, Apple2Orange
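Cycle consistency requires that translating to the other domain and back recovers the input: F(G(x)) ≈ x and G(F(y)) ≈ y, penalized with an L1 loss. A toy sketch where the two "generators" are trivial invertible functions rather than conv nets:

```python
import numpy as np

# Stand-ins for the two generators: G maps domain X -> Y, F maps Y -> X.
G = lambda x: x + 1.0
F = lambda y: y - 1.0

def cycle_loss(x, y):
    """L1 cycle-consistency loss: round trips through both
    generators should reconstruct the original inputs."""
    return np.mean(np.abs(F(G(x)) - x)) + np.mean(np.abs(G(F(y)) - y))

x = np.random.randn(8)  # samples from domain X (e.g. horses)
y = np.random.randn(8)  # samples from domain Y (e.g. zebras)
loss = cycle_loss(x, y)  # ~0 here, since F exactly inverts G
```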

📚 Educational Resources

Online Courses and Tutorials

  1. NIPS 2016 Tutorial: Generative Adversarial Networks by Ian Goodfellow
  2. MIT Deep Learning Course - Includes comprehensive GAN coverage
  3. Stanford CS231n - Convolutional Neural Networks for Visual Recognition
  4. Fast.ai Practical Deep Learning - Practical approach to deep learning including GANs

Video Lectures

  1. Ian Goodfellow: Generative Adversarial Networks (NIPS 2016)
  2. Two Minute Papers: GAN Series - Accessible explanations of latest GAN research
  3. Lex Fridman Podcast with Ian Goodfellow

Books and Reading Materials

  1. Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
    • Chapter 20: Deep Generative Models
  2. Pattern Recognition and Machine Learning by Christopher Bishop
  3. Generative Deep Learning by David Foster

Research Papers - Essential Reading

  1. Improved Techniques for Training GANs (2016) - Salimans et al.
  2. Wasserstein GAN (2017) - Arjovsky et al.
  3. Spectral Normalization for GANs (2018) - Miyato et al.
  4. Self-Attention Generative Adversarial Networks (2018) - Zhang et al.
  5. StyleGAN (2018) - Karras et al.

Interactive Resources

  1. GAN Lab - Interactive visualization of GAN training
  2. TensorFlow GAN Playground - Hands-on DCGAN tutorial
  3. PyTorch GAN Zoo - Collection of GAN implementations

πŸ† Best Practices

Training Tips

  1. Normalize inputs to [-1, 1] range
  2. Use different learning rates for generator and discriminator (typically D_lr > G_lr)
  3. Monitor training metrics carefully - D_loss and G_loss should be balanced
  4. Use gradient penalties (WGAN-GP) for improved stability
  5. Apply spectral normalization in discriminator for Lipschitz constraint
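Tip 1 is worth spelling out: DCGAN-style generators end in a tanh, so real images are scaled to the same [-1, 1] range before being shown to the discriminator. A minimal sketch:

```python
import numpy as np

def normalize_images(batch):
    """Scale uint8 pixels from [0, 255] to [-1, 1],
    matching the tanh output range of the generator."""
    return batch.astype(np.float32) / 127.5 - 1.0

batch = np.array([[0, 127, 255]], dtype=np.uint8)
scaled = normalize_images(batch)  # spans exactly [-1, 1]
```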

Architecture Guidelines

  1. Replace pooling with strided convolutions (discriminator) and fractional-strided convolutions (generator)
  2. Use BatchNorm in generator and LayerNorm/InstanceNorm in discriminator
  3. Use LeakyReLU in discriminator (0.2 slope) and ReLU in generator
  4. Avoid fully connected layers for deeper architectures
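Guideline 1 above replaces pooling with learned resampling, and the spatial arithmetic is easy to check by hand. A sketch of the standard output-size formulas (assuming no dilation or output padding):

```python
def conv_out(size, kernel, stride, pad):
    """Output size of a strided convolution (discriminator downsampling)."""
    return (size + 2 * pad - kernel) // stride + 1

def conv_transpose_out(size, kernel, stride, pad):
    """Output size of a fractional-strided (transposed) convolution
    (generator upsampling)."""
    return (size - 1) * stride + kernel - 2 * pad

# A typical DCGAN step: 4x4 kernel, stride 2, padding 1 halves or doubles
# the spatial size, e.g. 28 -> 14 in D and 14 -> 28 in G.
down = conv_out(28, 4, 2, 1)
up = conv_transpose_out(14, 4, 2, 1)
```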

Common Pitfalls to Avoid

  1. Mode collapse - Generator produces limited variety
  2. Vanishing gradients - Discriminator becomes too strong
  3. Oscillating losses - Unstable training dynamics
  4. Memory issues - Use gradient checkpointing for large models

Evaluation Metrics

  1. Inception Score (IS) - Measures quality and diversity
  2. Fréchet Inception Distance (FID) - Compares feature distributions
  3. Precision and Recall - Measures quality vs diversity trade-off
  4. Human evaluation - Still the gold standard for many applications
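FID fits a Gaussian to Inception features of real and generated images and measures the Fréchet distance between the two. In one dimension that distance reduces to (μ₁ − μ₂)² + σ₁² + σ₂² − 2σ₁σ₂; the toy sketch below uses this 1-D form on synthetic samples (real FID is multivariate and uses an Inception network):

```python
import numpy as np

def frechet_1d(a, b):
    """Squared Frechet distance between 1-D Gaussians fitted to two
    samples. Illustrative only: real FID uses multivariate Gaussians
    over Inception-v3 features."""
    mu_a, mu_b = a.mean(), b.mean()
    s_a, s_b = a.std(), b.std()
    return (mu_a - mu_b) ** 2 + s_a ** 2 + s_b ** 2 - 2 * s_a * s_b

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, 10_000)
close = rng.normal(0.1, 1.0, 10_000)  # a "good generator": near-matching stats
far = rng.normal(3.0, 2.0, 10_000)    # a "bad generator": shifted and spread
# Lower is better: the close sample scores a much smaller distance.
```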

🌐 Community Resources

Datasets

  • CelebA - Celebrity faces dataset
  • LSUN - Large-scale scene understanding
  • FFHQ - Flickr-Faces-HQ dataset
  • ImageNet - Large-scale image database

πŸ“ Citation

If you use this code in your research, please cite the book:

@book{langr2019gans,
  title={GANs in Action: Deep Learning with Generative Adversarial Networks},
  author={Langr, Jakub and Bok, Vladimir},
  year={2019},
  publisher={Manning Publications}
}

📄 License

This code is provided under the MIT License. See the LICENSE file for details.

🤝 Contributing

We welcome contributions to improve this repository! If you find any issues, have suggestions for improvements, or want to add more comprehensive examples, please feel free to open an issue or submit a pull request.

Made with ❤️ by the GANs in Action team
