
Federated Learning Platform

Production-ready federated learning platform with privacy-preserving distributed training, communication efficiency, and Byzantine robustness.

Features

Algorithms

  • Hybrid gradient compression (20-50x compression ratio)
  • Byzantine-robust aggregation (Multi-Krum, Trimmed Mean)
  • Differential privacy (DP-SGD)
  • Membership inference attack validation
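
The Byzantine-robust aggregators listed above share one idea: discard outlier client updates before averaging. A minimal sketch of coordinate-wise Trimmed Mean (the function name and toy data are illustrative, not the platform's actual `robust_aggregation.py` API):

```python
from statistics import mean

def trimmed_mean(updates, trim_k):
    """Coordinate-wise trimmed mean: for each parameter, drop the
    trim_k smallest and trim_k largest client values, then average
    the rest. Tolerates up to trim_k Byzantine clients."""
    if len(updates) <= 2 * trim_k:
        raise ValueError("need more than 2 * trim_k client updates")
    aggregated = []
    for coord in zip(*updates):  # one tuple of client values per parameter
        trimmed = sorted(coord)[trim_k:len(coord) - trim_k]
        aggregated.append(mean(trimmed))
    return aggregated

# Four honest clients plus one attacker sending a huge update:
updates = [[0.9, 1.1], [1.0, 1.0], [1.1, 0.9], [1.0, 1.0], [100.0, -100.0]]
print(trimmed_mean(updates, trim_k=1))  # attacker's extreme values are trimmed away
```

With `trim_k=1` the attacker's values land in the trimmed tails, so the aggregate stays close to the honest clients' mean.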

Infrastructure

  • Kubernetes deployment with Helm
  • CI/CD pipeline with GitHub Actions
  • Prometheus + Grafana monitoring
  • MLflow experiment tracking

Quick Start

```shell
./launch-platform.sh
```


Core Structure

```
complete/fl/
├── fl/
│   ├── task.py          # Training loop
│   ├── server_app.py    # Server aggregation
│   ├── client_app.py    # Client training
│   ├── compression.py   # Gradient compression
│   ├── robust_aggregation.py  # Byzantine robustness
│   └── privacy/         # Privacy validation
├── config/
│   └── default.yaml     # Configuration
└── tests/
    └── test_*.py        # Test suite
```
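
`compression.py` implements the hybrid gradient compression. As an illustration of one common ingredient of such schemes, top-k sparsification, here is a minimal sketch (function names and data are hypothetical, not the module's actual API):

```python
def topk_sparsify(grad, ratio):
    """Keep only the largest-magnitude fraction of gradient entries;
    transmit (index, value) pairs instead of the dense vector."""
    k = max(1, int(len(grad) * ratio))
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    return sorted((i, grad[i]) for i in idx)

def densify(pairs, size):
    """Reconstruct a dense gradient on the server, zero-filling dropped entries."""
    dense = [0.0] * size
    for i, v in pairs:
        dense[i] = v
    return dense

grad = [0.02, -1.5, 0.003, 0.8, -0.01, 0.4]
pairs = topk_sparsify(grad, ratio=0.5)  # half the entries dropped before upload
print(densify(pairs, len(grad)))
```

Sparsification alone gives roughly 1/ratio compression; hybrid schemes typically combine it with quantization and error feedback to reach the higher ratios quoted above.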

Configuration

Edit complete/fl/config/default.yaml:

```yaml
topology:
  num_clients: 10
  fraction: 0.5

train:
  lr: 0.01
  local_epochs: 1
  num_server_rounds: 10

data:
  dataset: "albertvillanova/medmnist-v2"
  subset: "pneumoniamnist"
  batch_size: 32

privacy:
  dp_sgd:
    enabled: true
    noise_multiplier: 0.8
    target_epsilon: 3.0
```
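
Under `dp_sgd`, `noise_multiplier` sets the Gaussian noise scale relative to the gradient clipping norm. A minimal sketch of the DP-SGD clip-and-noise step, assuming a clipping norm of 1.0 (the helper name and defaults are illustrative, not the platform's API):

```python
import math
import random

def clip_and_noise(grad, max_norm=1.0, noise_multiplier=0.8, rng=random):
    """One DP-SGD step on a per-example gradient: clip its L2 norm to
    max_norm, then add Gaussian noise with std = noise_multiplier * max_norm
    (0.8 matching the config above)."""
    norm = math.sqrt(sum(g * g for g in grad))
    scale = min(1.0, max_norm / norm) if norm > 0 else 1.0
    clipped = [g * scale for g in grad]
    std = noise_multiplier * max_norm
    return [g + rng.gauss(0.0, std) for g in clipped]

private = clip_and_noise([3.0, 4.0])  # L2 norm 5.0 is clipped to 1.0 before noising
```

In practice a privacy accountant tracks the cumulative epsilon across rounds and stops training once `target_epsilon` (3.0 here) is reached.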

Development

Local setup:

```shell
cd complete/fl
pip install -e ".[dev]"
pytest tests/ -v --cov=fl
flwr run . local-simulation --stream
```

Docker:

```shell
./launch-platform.sh
docker compose -f complete/compose-with-ui.yml down
```


Testing

```shell
cd complete/fl
pytest tests/ -v --cov=fl --cov-report=html
```

License

MIT
