Centralize magic numbers into constants module #45
Co-authored-by: Anselmoo <13209783+Anselmoo@users.noreply.github.com>
Pull request overview
This pull request establishes the foundation for an incremental migration of magic numbers to a centralized constants module, addressing PLR2004 (magic-value comparison) lint violations. It introduces `opt/constants.py` with 50+ well-documented constants and migrates four core files as a proof of concept.
Key Changes
- Created centralized constants module with comprehensive documentation covering population defaults, convergence thresholds, PSO parameters, gradient optimizer constants, and benchmark function bounds
- Migrated 4 core optimizer files (abstract_optimizer, particle_swarm, nadam, adamw) to use named constants instead of hardcoded values
- Added comprehensive test suite with 20 tests validating constant types, ranges, and mathematical relationships
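As a rough sketch of the module's shape, the excerpt below uses constant names taken from this PR but placeholder values (common library defaults), so the numbers shown are illustrative rather than the repository's actual definitions:

```python
"""Centralized numeric constants for the opt package (illustrative excerpt)."""

# --- Population defaults ---
DEFAULT_POPULATION_SIZE = 30
"""Default number of candidate solutions per generation (placeholder value)."""

# --- Gradient-based optimizer constants ---
ADAM_BETA1 = 0.9
"""Exponential decay rate for first-moment estimates."""

ADAM_BETA2 = 0.999
"""Exponential decay rate for second-moment estimates."""

ADAM_EPSILON = 1e-8
"""Small term added to denominators to avoid division by zero."""

# --- PSO parameters ---
PSO_INERTIA_WEIGHT = 0.7
"""Velocity retention factor for particle swarm optimization (placeholder value)."""
```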
Reviewed changes
Copilot reviewed 8 out of 8 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| opt/constants.py | New centralized constants module with 50+ documented constants organized by category (population, convergence, PSO, gradient-based, etc.) |
| opt/test/test_constants.py | Comprehensive test suite validating constant types, ranges, and relationships |
| opt/CONSTANTS.md | Documentation covering usage patterns, migration status, and contributing guidelines |
| opt/abstract_optimizer.py | Migrated to use DEFAULT_MAX_ITERATIONS, DEFAULT_POPULATION_SIZE, DEFAULT_SEED, and POWER_THIRTY_TWO constants |
| opt/swarm_intelligence/particle_swarm.py | Migrated PSO-specific parameters to use PSO_INERTIA_WEIGHT, PSO_COGNITIVE_COEFFICIENT, and PSO_SOCIAL_COEFFICIENT |
| opt/gradient_based/nadam.py | Migrated to use ADAM_BETA1, ADAM_BETA2, ADAM_EPSILON, and NADAM_LEARNING_RATE constants |
| opt/gradient_based/adamw.py | Migrated to use Adam constants plus ADAMW_LEARNING_RATE and ADAMW_WEIGHT_DECAY |
| pyproject.toml | Added explanatory comments documenting the incremental migration strategy for PLR2004 |
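The range and relationship checks in `opt/test/test_constants.py` presumably resemble the sketch below; the test names and exact bounds are illustrative assumptions, not code copied from the PR:

```python
import math

from opt.constants import ADAM_BETA1, ADAM_BETA2, ADAM_EPSILON, PSO_INERTIA_WEIGHT


def test_adam_betas_are_valid_decay_rates() -> None:
    # Both decay rates must lie in (0, 1), and beta2 conventionally exceeds beta1.
    assert 0.0 < ADAM_BETA1 < 1.0
    assert 0.0 < ADAM_BETA2 < 1.0
    assert ADAM_BETA1 < ADAM_BETA2


def test_adam_epsilon_is_a_small_positive_float() -> None:
    assert isinstance(ADAM_EPSILON, float)
    assert 0.0 < ADAM_EPSILON < 1e-3


def test_pso_inertia_weight_is_in_stable_range() -> None:
    # Inertia weights are typically kept below 1 to keep particle velocities bounded.
    assert 0.0 < PSO_INERTIA_WEIGHT < 1.0
    assert math.isfinite(PSO_INERTIA_WEIGHT)
```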
`opt/constants.py` (outdated):

```python
ACKLEY_C = 2.0
"""Ackley function constant 'c' (typically 2*pi)."""
```
The ACKLEY_C constant is set to 2.0, but its docstring states it is "typically 2*pi". The actual Ackley implementation in opt/benchmark/functions.py (line 59) uses `2.0 * np.pi`, which is approximately 6.283185307179586, not 2.0.
If this constant is meant to represent the 'c' parameter of the standard Ackley function formula, it should be:
- `ACKLEY_C = 2.0 * np.pi` (approximately 6.283185307179586)
Alternatively, if the intent is to keep a separate constant for the multiplier (2.0) and use it as `ACKLEY_C * np.pi`, the docstring should be updated to clarify that usage pattern. However, this would be non-standard and potentially confusing.
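For reference, here is a minimal NumPy rendering of the standard Ackley function showing where the c = 2*pi term enters; this is a sketch to make the comment concrete, not the implementation in opt/benchmark/functions.py:

```python
import numpy as np

ACKLEY_A = 20.0
ACKLEY_B = 0.2
ACKLEY_C = 2.0 * np.pi  # the 'c' parameter the docstring describes


def ackley(x: np.ndarray) -> float:
    """Standard Ackley function; global minimum f(0) = 0."""
    n = x.size
    term1 = -ACKLEY_A * np.exp(-ACKLEY_B * np.sqrt(np.sum(x**2) / n))
    term2 = -np.exp(np.sum(np.cos(ACKLEY_C * x)) / n)
    return float(term1 + term2 + ACKLEY_A + np.e)
```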
The codebase had 381 PLR2004 violations (magic-number comparisons), with values scattered across 100+ files. This PR creates a centralized constants module as the foundation for an incremental migration.
Changes

Created
- `opt/constants.py`

Migrated 4 core files (proof of concept)
- `abstract_optimizer.py`: DEFAULT_MAX_ITERATIONS, DEFAULT_POPULATION_SIZE, DEFAULT_SEED
- `particle_swarm.py`: PSO_INERTIA_WEIGHT, PSO_COGNITIVE_COEFFICIENT, PSO_SOCIAL_COEFFICIENT
- `nadam.py`: ADAM_BETA1, ADAM_BETA2, ADAM_EPSILON, NADAM_LEARNING_RATE
- `adamw.py`: All of the above plus ADAMW_LEARNING_RATE, ADAMW_WEIGHT_DECAY

Test coverage
- `opt/test/test_constants.py`: 20 tests validating types, ranges, and mathematical relationships

Documentation
- `opt/CONSTANTS.md`: Usage guide, migration status, contributing guidelines
- `pyproject.toml`: Migration strategy comments explaining the incremental approach

Example
Before:
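The original snippets are not reproduced here, so the following is a hypothetical "before" in the spirit of `particle_swarm.py`; the class name, parameter names, and literal values are illustrative only:

```python
# Hypothetical "before": PSO coefficients hardcoded as unexplained literals.
class ParticleSwarm:
    def __init__(
        self,
        inertia_weight: float = 0.7,         # magic number
        cognitive_coefficient: float = 1.5,  # magic number
        social_coefficient: float = 1.5,     # magic number
    ) -> None:
        self.inertia_weight = inertia_weight
        self.cognitive_coefficient = cognitive_coefficient
        self.social_coefficient = social_coefficient
```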
After:
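And the corresponding hypothetical "after", pulling the same defaults from the constants this PR introduces:

```python
# Hypothetical "after": defaults sourced from the centralized constants module.
from opt.constants import (
    PSO_COGNITIVE_COEFFICIENT,
    PSO_INERTIA_WEIGHT,
    PSO_SOCIAL_COEFFICIENT,
)


class ParticleSwarm:
    def __init__(
        self,
        inertia_weight: float = PSO_INERTIA_WEIGHT,
        cognitive_coefficient: float = PSO_COGNITIVE_COEFFICIENT,
        social_coefficient: float = PSO_SOCIAL_COEFFICIENT,
    ) -> None:
        self.inertia_weight = inertia_weight
        self.cognitive_coefficient = cognitive_coefficient
        self.social_coefficient = social_coefficient
```

Named defaults make the optimizer signatures self-describing and give PLR2004 a single, documented source of truth to point at.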
Impact