
Commit 6c1b13b

Merge branch 'main' of github.com:codinglabsong/aging-gan
2 parents: bde72ca + 06dfbd4

File tree

3 files changed: +12 -8 lines

- README.md
- src/aging_gan/model.py
- src/aging_gan/train.py

README.md

Lines changed: 11 additions & 4 deletions
@@ -1,11 +1,16 @@
-# Aging GAN
-Aging GAN is an unpaired image-to-image translation project for facial age transformation built on a CycleGAN-style framework. Created with PyTorch, it provides end-to-end tooling—from data preprocessing to training, checkpointing, and a live Gradio demo—while offering infrastructure utilities to simplify training on AWS EC2 instances. The model trains two ResNet‑style "encoder-residual-decoder" generators and two PatchGAN discriminators on the UTKFace dataset, split into **Young** and **Old** subsets. The generators learn to translate between these domains, effectively "aging" or "de-aging" a face image.
+# Face-Aging CycleGAN
+Face-Aging CycleGAN is an unpaired image-to-image translation project for facial age transformation built on a CycleGAN-style framework. Created with PyTorch, it provides end-to-end tooling—from data preprocessing to training, checkpointing, and a live Gradio demo—while offering infrastructure utilities to simplify training on AWS EC2 instances. The model trains two ResNet‑style "encoder-residual-decoder" generators and two PatchGAN discriminators on the UTKFace dataset, split into **Young** and **Old** subsets. The generators learn to translate between these domains, effectively "aging" or "de-aging" a face image.
 
 This repository contains training scripts, helper utilities, inference scripts, and a Gradio app for demo purposes.
 
+You can also read more about this project's development, from paper to application, in my [blog post](https://codinglabsong.medium.com/face-aging-cyclegan-from-paper-to-application-ba22269549de).
+
+## Motivation
+Face‑aging models aren't just "fun filters" on social apps; they address real needs across law enforcement, healthcare, and even entertainment. One of the most important applications is in forensic science and missing‑person investigations. When a child vanishes, their appearance shifts with every passing year. Age‑progressed images generated by an aging GAN can help police and the public recognize what the child might look like today. The same techniques assist in cold‑case searches for adults, helping law enforcement update outdated photographs to increase the chances of identification. In healthcare and dermatology, face‑aging networks can even simulate how a patient's skin is likely to evolve under various conditions. Different tasks require different datasets and architectures, but this project shows how theory can be turned into a meaningful application.
+
 ## Features
 - **CycleGAN Architecture** - ResNet‑style "encoder-residual-decoder" generators and PatchGAN discriminators. In addition to adversarial loss, cycle‑consistency loss is used to preserve content/structure, and identity loss is added to preserve the color and style of the original image.
-- **Data Pipeline & Preprocessing** - deterministic train/val/test splits, on-the-fly augmentations, unpaired DataLoader that pairs Young (18–28) and Old (40+) faces at each batch.
+- **Data Pipeline & Preprocessing** - deterministic train/val/test splits, on-the-fly augmentations, unpaired DataLoader that pairs Young (18–28) and Middle-aged/Old (40+) faces at each batch.
 - **Training Utilities & Efficiency** - gradient clipping to stabilize adversarial updates, separate generator/discriminator learning rates with linear decay for the latter half of training, mixed precision via `accelerate` for 2× speed/memory improvements, and checkpointing models with per-epoch generated sample images for evaluation.
 - **Evaluation** - FID (Fréchet Inception Distance) evaluation on validation and test splits.
 - **Weights & Biases Logging** - track losses and metrics during training.
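
The loss terms named in the **CycleGAN Architecture** bullet combine into a single generator objective. Below is a minimal sketch of that combination; the function name, the discriminator-output handling, and the weights `lambda_cyc` and `lambda_id` are illustrative assumptions, not the repository's exact training code.

```python
# Sketch of the generator-side CycleGAN objective: adversarial + cycle + identity.
# G: young -> old, F: old -> young, DX/DY: PatchGAN discriminators for each domain.
import torch
import torch.nn as nn

mse = nn.MSELoss()  # LSGAN-style adversarial loss
l1 = nn.L1Loss()    # cycle-consistency and identity losses


def generator_loss(G, F, DX, DY, real_x, real_y, lambda_cyc=10.0, lambda_id=5.0):
    fake_y = G(real_x)  # aged version of a young face
    fake_x = F(real_y)  # de-aged version of an old face

    # adversarial: generated images should be scored as "real" by the discriminators
    pred_fake_y, pred_fake_x = DY(fake_y), DX(fake_x)
    adv = mse(pred_fake_y, torch.ones_like(pred_fake_y)) + mse(
        pred_fake_x, torch.ones_like(pred_fake_x)
    )
    # cycle-consistency: translating back should recover the original image
    cyc = l1(F(fake_y), real_x) + l1(G(fake_x), real_y)
    # identity: a target-domain input should pass through mostly unchanged
    idt = l1(G(real_y), real_y) + l1(F(real_x), real_x)

    return adv + lambda_cyc * cyc + lambda_id * idt
```
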
@@ -99,6 +104,8 @@ These results are on the test set for the model checkpoint that received the low
 You can download this checkpoint model on [Releases](https://github.com/codinglabsong/aging-gan/releases/tag/v1.0.0).
 
 ### Example Outputs
+The examples below show the transition between the young and the middle-aged/old domains.
+
 <table>
 <caption style="caption-side:top; font-weight:bold; text-align:center;">
 Young -> Old
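
The checkpoint linked above was selected by FID on the validation split. As a rough illustration of how FID can be computed with `torchmetrics` (the metric configuration, `val_loader`, and the use of `G` here are assumptions, not necessarily what this repository's evaluation code does):

```python
# Sketch: FID between real old-domain images and generated (aged) images.
# Assumes float image tensors in [0, 1] and an existing val_loader / generator G.
import torch
from torchmetrics.image.fid import FrechetInceptionDistance

fid = FrechetInceptionDistance(feature=2048, normalize=True)

for real_old, young in val_loader:        # unpaired batches from the validation split
    with torch.no_grad():
        fake_old = G(young)               # generator: young -> old
    fid.update(real_old, real=True)
    fid.update(fake_old.clamp(0, 1), real=False)

print("FID:", fid.compute().item())
```
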
@@ -196,4 +203,4 @@ Contributions are welcome! Feel free to open issues or submit pull requests.
 - [Old2Young Input 3 Image](https://www.iese.edu/standout/women-board-directors-keys-leadership/)
 
 ## License
-This project is licensed under the [MIT License](LICENSE).
+This project is licensed under the [MIT License](LICENSE).

src/aging_gan/model.py

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ def forward(self, x: Tensor) -> Tensor:
 
 
 class Generator(nn.Module):
-    """U-Net style generator used for domain translation."""
+    """ResNet‑style generator used for domain translation."""
 
     def __init__(self, ngf: int, n_residual_blocks: int = 9) -> None:
         super().__init__()
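
For context on the corrected docstring, here is a condensed sketch of a ResNet-style "encoder-residual-decoder" generator in the standard CycleGAN layout; the exact layer configuration in `src/aging_gan/model.py` may differ.

```python
# Sketch of an encoder-residual-decoder generator (standard CycleGAN layout).
# Layer choices are illustrative, not copied from src/aging_gan/model.py.
import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(self, ch: int) -> None:
        super().__init__()
        self.block = nn.Sequential(
            nn.ReflectionPad2d(1), nn.Conv2d(ch, ch, 3), nn.InstanceNorm2d(ch), nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1), nn.Conv2d(ch, ch, 3), nn.InstanceNorm2d(ch),
        )

    def forward(self, x):
        return x + self.block(x)  # residual connection around two conv layers


class Generator(nn.Module):
    def __init__(self, ngf: int = 64, n_residual_blocks: int = 9) -> None:
        super().__init__()
        layers = [  # encoder: 7x7 stem, then two stride-2 downsampling convs
            nn.ReflectionPad2d(3), nn.Conv2d(3, ngf, 7), nn.InstanceNorm2d(ngf), nn.ReLU(inplace=True),
            nn.Conv2d(ngf, ngf * 2, 3, stride=2, padding=1), nn.InstanceNorm2d(ngf * 2), nn.ReLU(inplace=True),
            nn.Conv2d(ngf * 2, ngf * 4, 3, stride=2, padding=1), nn.InstanceNorm2d(ngf * 4), nn.ReLU(inplace=True),
        ]
        layers += [ResidualBlock(ngf * 4) for _ in range(n_residual_blocks)]  # residual bottleneck
        layers += [  # decoder: two stride-2 upsampling convs, then 7x7 output conv
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 3, stride=2, padding=1, output_padding=1),
            nn.InstanceNorm2d(ngf * 2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(ngf * 2, ngf, 3, stride=2, padding=1, output_padding=1),
            nn.InstanceNorm2d(ngf), nn.ReLU(inplace=True),
            nn.ReflectionPad2d(3), nn.Conv2d(ngf, 3, 7), nn.Tanh(),
        ]
        self.model = nn.Sequential(*layers)

    def forward(self, x):
        return self.model(x)
```
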

src/aging_gan/train.py

Lines changed: 0 additions & 3 deletions
@@ -138,9 +138,6 @@ def initialize_optimizers(
     cfg, G, F, DX, DY
 ) -> tuple[optim.Optimizer, optim.Optimizer, optim.Optimizer, optim.Optimizer]:
     """Create Adam optimizers for all models."""
-    # track all generator params (even frozen encoder params during initial training).
-    # This would allow us to transition easily to the full fine-tuning later on by simply toggling requires_grad=True
-    # since the optimizers already track all the parameters from the start.
     opt_G = optim.Adam(
         G.parameters(),
         lr=cfg.gen_lr,
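
The README describes separate generator/discriminator learning rates with linear decay over the latter half of training. A hedged sketch of that setup follows; `cfg.gen_lr`, the networks `G`, `F`, `DX`, `DY`, and the four-optimizer signature come from the diff above, while `cfg.disc_lr`, `cfg.num_epochs`, and the Adam betas are assumptions for illustration.

```python
# Sketch: per-network Adam optimizers plus a linear-decay LR schedule.
import torch.optim as optim


def make_linear_decay(total_epochs: int):
    """LR multiplier: constant for the first half of training, then linear decay to 0."""
    half = total_epochs // 2

    def fn(epoch: int) -> float:
        if epoch < half:
            return 1.0
        return max(0.0, 1.0 - (epoch - half) / max(1, total_epochs - half))

    return fn


# one optimizer per network; generators and discriminators use separate learning rates
opt_G = optim.Adam(G.parameters(), lr=cfg.gen_lr, betas=(0.5, 0.999))
opt_F = optim.Adam(F.parameters(), lr=cfg.gen_lr, betas=(0.5, 0.999))
opt_DX = optim.Adam(DX.parameters(), lr=cfg.disc_lr, betas=(0.5, 0.999))
opt_DY = optim.Adam(DY.parameters(), lr=cfg.disc_lr, betas=(0.5, 0.999))

# step these once per epoch
schedulers = [
    optim.lr_scheduler.LambdaLR(opt, lr_lambda=make_linear_decay(cfg.num_epochs))
    for opt in (opt_G, opt_F, opt_DX, opt_DY)
]
```
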
