Releases: PrasannaPulakurthi/Foreground-Background-Augmentation
Release v0.1.0 — Initial Public Release
Highlights
- First open release of Dual-Region Foreground–Background Augmentation (DRA)
- Two reference implementations:
  - `SFDA/`: Source-Free Domain Adaptation on PACS (Hydra-driven)
  - `Person_ReID/`: person re-identification baselines with DRA
- Augmentation utilities and visualization helpers
- Configurable training via CLI overrides
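
As a rough illustration of the dual-region idea — not the repository's actual implementation (the function name, signature, and compositing logic below are assumptions) — foreground and background pixels, separated by a binary mask, can receive independent augmentations and be recomposited:

```python
import numpy as np

def dual_region_augment(image, mask, fg_aug, bg_aug):
    """Apply separate augmentations to foreground and background regions.

    image:  HxWxC float array in [0, 1]
    mask:   HxW binary array (1 = foreground)
    fg_aug, bg_aug: callables mapping an image to an augmented image

    Hypothetical helper for illustration only; see the SFDA/ and
    Person_ReID/ code for the actual DRA implementation.
    """
    mask3 = mask[..., None].astype(image.dtype)   # broadcast mask over channels
    fg = fg_aug(image) * mask3                    # keep augmented foreground pixels
    bg = bg_aug(image) * (1.0 - mask3)            # keep augmented background pixels
    return fg + bg

# Example: brighten the foreground, darken the background
img = np.random.rand(4, 4, 3).astype(np.float32)
msk = np.zeros((4, 4), dtype=np.float32)
msk[1:3, 1:3] = 1.0
out = dual_region_augment(
    img, msk,
    fg_aug=lambda x: np.clip(x * 1.2, 0.0, 1.0),
    bg_aug=lambda x: x * 0.5,
)
```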
Repository Layout
- `SFDA/` — core SFDA pipeline (`main_win.py`, `configs/`, `datasets/PACS/*` lists)
- `Person_ReID/` — training, testing, and augmentation for ReID
- `assets/` — figures for the README
- `SFDA/scripts_win/` — example commands (Windows)
Install
- Python 3.8+
- SFDA: `pip install -r SFDA/requirements.txt` (install PyTorch/torchvision matching your CUDA version from pytorch.org)
- ReID: `pip install -r Person_ReID/requirements.txt`
Quickstart
- SFDA smoke test (no W&B, small batch, 1 epoch):
  python SFDA/main_win.py learn.epochs=1 data.batch_size=16 use_wandb=false memo=release-smoke
- Datasets: place PACS under `datasets/PACS/` as described in `SFDA/README.md`
- ReID baseline (Market-1501 example):
- Train:
    python Person_ReID/train.py --gpu_ids 0 --use_rn18 --batchsize 32 --data_dir Person_ReID/data/Market/pytorch
  - Test:
    python Person_ReID/test.py --gpu_ids 0 --use_rn18 --test_dir Person_ReID/data/Market/pytorch
Notes & Known Issues
- Windows example scripts live in `SFDA/scripts_win/`; run them by pasting the commands into your terminal.
- Multi-GPU/distributed runs require CUDA devices; single-GPU/CPU users should keep defaults conservative and reduce workers/batch size if needed.
- Mask generation (U^2-Net for SFDA; MediaPipe + SAM2 for ReID) is optional preprocessing; see subproject READMEs for instructions.
Licensing & Citation
- License: MIT (see `LICENSE`)
- Please cite the paper linked in the top-level `README.md` if this work helps your research.
Support
- Issues and questions: open a GitHub issue including your environment, the exact command, and logs. Include your dataset layout if relevant.