
ACGO

ACGO: A GA framework guided by (q)Expected Improvement with continual surrogate updates

Team TurboS: Leidong Xu, Zhengkun Feng

University of Connecticut

A practical, looped optimization methodology that couples Gaussian-process (GP) surrogates with genetic search (NSGA-II) and acquisition-guided batch selection. ACGO is designed for expensive simulators: it prioritizes feasible, high-value evaluations while continually retraining on newly acquired data.


Problem setting

  • Design variables (3): X Cell Size (mm), YZ Cell Size (mm), Velocity Inlet (mm/s)
  • Predicted properties (4): PressureDrop, AvgVelocity, Mass, Surface Area
  • Constraints:
    • Pressure drop: PressureDrop ≤ PD_MAX
    • Flow/throughput: AvgVelocity ≥ V_MIN
    • Fabrication/weight: Mass ≤ MASS_MAX
  • Objective: maximize Surface Area. Two common choices:
    1. Mean-only: maximize predicted mean Surface Area.
    2. UCB on standardized Surface Area: UCB = zA + κ * sA, where zA = (μ_A − mean_A)/std_A and sA = σ_A/std_A. (Standardization keeps exploration well-scaled; κ controls exploration.)

Design goal: Find feasible points with maximal Surface Area while keeping wasted simulator calls (infeasible runs) low.
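
The standardized UCB objective above can be sketched in a few lines of NumPy (function and variable names here are illustrative, not taken from the repository):

```python
def ucb_standardized(mu_A, sigma_A, mean_A, std_A, kappa=1.0):
    """UCB on standardized Surface Area: zA + kappa * sA."""
    zA = (mu_A - mean_A) / std_A   # standardized predicted mean
    sA = sigma_A / std_A           # standardized predictive std
    return zA + kappa * sA
```

With kappa = 0 this reduces to the mean-only objective, up to an affine transform that leaves the argmax unchanged.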


Method overview

ACGO runs in outer iterations. Each iteration:

  1. Fit/update surrogate (GP):

    • Four heads (PressureDrop, AvgVelocity, Mass, Surface Area) via a batched SingleTaskGP (shared inputs, separate outputs).
    • Input Normalize and output Standardize transforms stabilize training.
    • Hyperparameters are warm-started from the previous iteration for stability.
  2. Conservative feasibility via calibrated margins:

    • From current training residuals, compute one-sided calibration offsets (conformal-style):
      • For upper-bounded targets (PressureDrop, Mass): Δ = q-quantile(truth − μ) (protect against under-prediction).
      • For lower-bounded target (AvgVelocity): Δ = q-quantile(μ − truth) (protect against over-prediction).
    • Robust gate per candidate:
      • PressureDrop: μ + κσ + Δ_PD ≤ PD_MAX
      • AvgVelocity: μ − κσ − Δ_V ≥ V_MIN
      • Mass: μ + κσ + Δ_M ≤ MASS_MAX
  3. Candidate generation (NSGA-II on surrogate):

    • Decision space: 3D bounds supplied by user.
    • Objective: mean or UCB on Surface Area (standardized if using UCB).
    • Constraints: robust GP inequalities above.
    • Multi-start (restarts) recommended to diversify fronts.
  4. Acquisition-guided scoring & selection:

    • Score surrogate-feasible candidates with qNEI/qEI (exploitation) and p(feasible) (safety).
    • Form a pre-pool: top-K exploit + top-K explore.
    • Diversity pruning in normalized space using a minimum L2 distance (or K-means) to avoid crowding.
  5. Final feasibility screen (again):

    • Re-apply the robust gate to pruned candidates (optionally estimate p(feasible) via Monte-Carlo draws from the GP posterior and require p ≥ pfeas_min).
  6. Run simulator + append data:

    • Evaluate the batch, append successful rows, de-duplicate on design variables, and refit the GP (warm-start).
  7. Repeat until budget is exhausted.
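
Step 2's calibrated margins and robust gate can be sketched as follows (a minimal NumPy illustration; the function names and the quantile level q=0.9 are assumptions, not the repository's code):

```python
import numpy as np

def one_sided_margin(truth, mu, upper_bounded=True, q=0.9):
    """Conformal-style offset from training residuals.
    upper_bounded=True  (PressureDrop, Mass): Delta = q-quantile(truth - mu)
    upper_bounded=False (AvgVelocity):        Delta = q-quantile(mu - truth)
    """
    resid = truth - mu if upper_bounded else mu - truth
    return float(np.quantile(resid, q))

def robust_upper_gate(mu, sigma, delta, bound, kappa=1.0):
    """Accept iff mu + kappa*sigma + delta <= bound (PressureDrop, Mass)."""
    return mu + kappa * sigma + delta <= bound

def robust_lower_gate(mu, sigma, delta, bound, kappa=1.0):
    """Accept iff mu - kappa*sigma - delta >= bound (AvgVelocity)."""
    return mu - kappa * sigma - delta >= bound
```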
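
Step 4's diversity pruning by minimum L2 distance can be sketched as a greedy pass (illustrative only; the repository may use K-means instead):

```python
import numpy as np

def prune_by_min_distance(X_norm, scores, min_dist=0.1):
    """Walk candidates in descending score order and keep a point only if
    it lies at least min_dist (L2, in normalized design space) away from
    every point already kept."""
    kept = []
    for i in np.argsort(-np.asarray(scores)):
        if all(np.linalg.norm(X_norm[i] - X_norm[j]) >= min_dist for j in kept):
            kept.append(int(i))
    return kept
```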
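
Step 5's Monte-Carlo feasibility probability can be estimated from posterior draws. The sketch below assumes independent Gaussian marginals per output for simplicity (the true GP posterior may be correlated across candidates):

```python
import numpy as np

def p_feasible_mc(mu, sigma, pd_max, v_min, mass_max, n=2000, seed=0):
    """Fraction of posterior samples satisfying all three constraints.
    mu, sigma: length-3 arrays for (PressureDrop, AvgVelocity, Mass)."""
    rng = np.random.default_rng(seed)
    s = rng.normal(mu, sigma, size=(n, 3))
    ok = (s[:, 0] <= pd_max) & (s[:, 1] >= v_min) & (s[:, 2] <= mass_max)
    return float(ok.mean())
```

A candidate would then pass the probabilistic gate when p_feasible_mc(...) ≥ pfeas_min.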


Flowchart

```mermaid
flowchart LR
  A[Start / Existing data] --> B["Fit GP surrogates<br/>(Normalize + Standardize, warm-start)"]
  B --> C["Calibrate one-sided margins<br/>from training residuals"]
  C --> D["NSGA-II on surrogate<br/>(Objective: mean/UCB on Surface Area;<br/>Constraints: robust μ±κσ±Δ)"]
  D --> E["Acquisition scoring<br/>(qNEI/qEI, p(feasible))"]
  E --> F["Diversity pruning<br/>(min-distance or K-means)"]
  F --> G[Final robust + probabilistic gate]
  G --> H{Budget left?}
  H -- Yes --> I[Run simulator on selected batch]
  I --> J[Append new data, de-dup, warm-start GP]
  J --> B
  H -- No --> K[Stop / Report best feasible designs]
```