[WIP] Update docstrings for social-inspired algorithms (#107)
Conversation
Co-authored-by: Anselmoo <13209783+Anselmoo@users.noreply.github.com>
- Randomly alternates between constituency and campaign phases
- Adaptive party switching probability decreases over time
- BBOB: Returns final best solution after max_iter
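The alternation and decaying switch probability described in these notes can be sketched as follows. This is an illustrative sketch only: the function name, the 50/50 phase split, and the linear decay schedule are assumptions, not the repository's actual implementation.

```python
import numpy as np

def po_iteration_sketch(t, max_iter, rng):
    """Illustrative sketch of one Political Optimizer iteration.

    Randomly alternates between constituency and campaign phases; the
    party-switching probability decays linearly over the iterations.
    """
    switch_prob = 1.0 - t / max_iter          # adaptive: decreases over time
    phase = "constituency" if rng.random() < 0.5 else "campaign"
    do_switch = rng.random() < switch_prob    # party switching event
    return phase, do_switch

rng = np.random.default_rng(seed=42)
phase, do_switch = po_iteration_sketch(t=0, max_iter=100, rng=rng)
```

At `t=0` the switch probability is 1.0, so a switch always occurs; by the final iteration it approaches zero.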
Fixed in commit 5f5aaac - corrected indentation to match Google Python Style Guide.
 Returns:
-    tuple[np.ndarray, float]:
-        Best solution found and its fitness value
+    tuple[np.ndarray, float]:
+        - best_solution (np.ndarray): Best solution found, shape (dim,)
+        - best_fitness (float): Fitness value at best_solution
 Returns:
-    tuple[np.ndarray, float]:
-        Best solution found and its fitness value
+    tuple[np.ndarray, float]:
+        - best_solution (np.ndarray): Best solution found, shape (dim,)
+        - best_fitness (float): Fitness value at best_solution
- Each iteration simulates matches between weighted opponents
- Training phase (20% probability) adds exploration
- Transfer window (10% probability) enables dimension exchange
- BBOB: Returns final best solution after max_iter
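The stochastic event structure above (matches every iteration, training at 20%, transfers at 10%) can be sketched like this. The names and return shape are illustrative assumptions, not the repository's code:

```python
import numpy as np

def slc_event_sketch(rng):
    """Illustrative sketch of the stochastic events in one SLC iteration.

    Matches between weighted opponents run every iteration; the training
    phase (~20%) adds exploration and the transfer window (~10%)
    exchanges dimensions between players.
    """
    events = ["match"]            # matches happen unconditionally
    if rng.random() < 0.20:       # training phase probability
        events.append("training")
    if rng.random() < 0.10:       # transfer window probability
        events.append("transfer")
    return events

rng = np.random.default_rng(seed=0)
events = slc_event_sketch(rng)
```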
Fixed in commit 5f5aaac - corrected indentation to match Google Python Style Guide.
tuple[np.ndarray, float]:
    - best_solution (np.ndarray): Best solution found, shape (dim,)
    - best_fitness (float): Fitness value at best_solution
- Executes improving, acquiring, and introspection phases per iteration
- Self-introspection coefficient adapts linearly: $c \cdot (1 - t)$
- Supports early stopping and convergence tracking
- BBOB: Returns final best solution after max_iter or early stop
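The linearly adapting self-introspection coefficient $c \cdot (1 - t)$, with $t$ the normalized iteration count, can be written out as a small sketch (function and parameter names are illustrative):

```python
def introspection_coefficient(c0, t, max_iter):
    """Sketch of the linearly adapting self-introspection coefficient.

    Implements c * (1 - t) with t normalized to [0, 1]: the coefficient
    shrinks from c0 at the first iteration to 0 at the final one.
    """
    frac = t / max_iter
    return c0 * (1.0 - frac)

start = introspection_coefficient(c0=0.5, t=0, max_iter=100)    # full strength
end = introspection_coefficient(c0=0.5, t=100, max_iter=100)    # fully decayed
```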
Fixed in commit 5f5aaac - corrected indentation to match Google Python Style Guide.
 Returns:
-    tuple[np.ndarray, float]:
-        Best solution found and its fitness value
+    tuple[np.ndarray, float]:
+        - best_solution (np.ndarray): Best solution found, shape (dim,)
+        - best_fitness (float): Fitness value at best_solution
- Executes both teacher and learner phases in each iteration
- Uses greedy selection for accepting new solutions
- BBOB: Returns final best solution after max_iter
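The teacher phase with greedy selection can be sketched as below, using the standard TLBO update (learners move toward the best solution relative to the class mean, with a teaching factor of 1 or 2). The function names and the sphere test function are illustrative assumptions, not the repository's code:

```python
import numpy as np

def teacher_phase_sketch(pop, fitness, func, rng):
    """Sketch of the TLBO teacher phase with greedy selection.

    A candidate replaces a learner only if it improves the fitness.
    """
    teacher = pop[np.argmin(fitness)]         # best solution acts as teacher
    mean = pop.mean(axis=0)                   # class mean
    for i in range(len(pop)):
        tf = rng.integers(1, 3)               # teaching factor, 1 or 2
        r = rng.random(pop.shape[1])
        candidate = pop[i] + r * (teacher - tf * mean)
        f_new = func(candidate)
        if f_new < fitness[i]:                # greedy: accept only improvements
            pop[i], fitness[i] = candidate, f_new
    return pop, fitness

# Usage on the sphere function
rng = np.random.default_rng(seed=1)
pop = rng.uniform(-5, 5, size=(10, 3))
sphere = lambda x: float(np.sum(x**2))
fitness = np.array([sphere(x) for x in pop])
best_before = fitness.min()
pop, fitness = teacher_phase_sketch(pop, fitness, sphere, rng)
```

Because selection is greedy, the best fitness can never worsen across a phase.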
Fixed in commit 5f5aaac - corrected indentation to match Google Python Style Guide (8 spaces from base for Returns/Raises/Notes sections).
…tes sections Co-authored-by: Anselmoo <13209783+Anselmoo@users.noreply.github.com>
Fixed in commit 3bdd274. The linting issues were from the custom Google docstring validator (
Anselmoo
left a comment
opt/social_inspired/teaching_learning.py: Args section should document 'seed' parameter for BBOB compliance
opt/social_inspired/teaching_learning.py: Attributes section should document 'seed' attribute (REQUIRED for BBOB)
opt/social_inspired/political_optimizer.py: Args section should document 'seed' parameter for BBOB compliance
opt/social_inspired/political_optimizer.py: Attributes section should document 'seed' attribute (REQUIRED for BBOB)
opt/social_inspired/soccer_league_optimizer.py: Args section should document 'seed' parameter for BBOB compliance
opt/social_inspired/soccer_league_optimizer.py: Attributes section should document 'seed' attribute (REQUIRED for BBOB)
Found 6 COCO/BBOB compliance issue(s)
opt/social_inspired/social_group_optimizer.py: Args section should document 'seed' parameter for BBOB compliance
opt/social_inspired/social_group_optimizer.py: Attributes section should document 'seed' attribute (REQUIRED for BBOB)
Found 2 COCO/BBOB compliance issue(s)
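The validator findings above all reduce to the same pattern: `seed` must appear in both the Args and Attributes sections. A minimal docstring skeleton satisfying that requirement might look like the following (the class name and signature are hypothetical, chosen only to illustrate the pattern):

```python
class ExampleOptimizer:
    """Example optimizer with a BBOB-compliant docstring skeleton.

    Args:
        func: Objective function to minimize.
        seed: Random seed for reproducibility (required so that each of
            BBOB's 15 independent runs is repeatable).

    Attributes:
        seed: Random seed used to initialize the RNG (REQUIRED for BBOB).
    """

    def __init__(self, func, seed=None):
        self.func = func
        self.seed = seed

opt = ExampleOptimizer(abs, seed=7)
```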
…ompliance Co-authored-by: Anselmoo <13209783+Anselmoo@users.noreply.github.com>
Fixed in commit aed1b35. Added
…tion in social algorithms
Pull request overview
This PR updates docstrings for four social-inspired optimization algorithms to achieve COCO/BBOB compliance. The updates replace FIXME placeholders with complete algorithm metadata, mathematical formulations, hyperparameter documentation, and social behavior analogies. All four optimizers now include seed parameters for reproducibility and follow Google Python Style Guide conventions.
Key changes:
- Complete COCO/BBOB-compliant docstrings with algorithm metadata (year, authors, complexity)
- Mathematical formulations with LaTeX notation for all algorithm phases
- Social behavior analogies explaining the real-world metaphors (classroom, political elections, social groups, soccer leagues)
- Addition of seed parameter to init signatures for reproducibility
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| opt/social_inspired/teaching_learning.py | Updated TLBO docstring with teacher/learner phase formulations, classroom learning analogy, and seed parameter. Fix for the "unse" → "unseeded" typo still needed. |
| opt/social_inspired/social_group_optimizer.py | Added complete SGO documentation with three-phase learning (improving, acquiring, introspection), social learning analogies, and seed parameter. |
| opt/social_inspired/soccer_league_optimizer.py | Documented SLC with match/training/transfer formulations, soccer league dynamics analogy, and seed parameter. |
| opt/social_inspired/political_optimizer.py | Added PO documentation with constituency/campaign/switching phases, political election analogy, and seed parameter. |
         seed: Random seed for reproducibility.
     """
-    super().__init__(func, lower_bound, upper_bound, dim)
+    super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
The super().__init__() call should include the population_size parameter to properly initialize the parent class. AbstractOptimizer accepts population_size as a parameter, and while you're setting self.population_size separately, it's better to pass it to the parent constructor for consistency. Consider changing to: super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
Suggested change:
-        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
+        super().__init__(
+            func,
+            lower_bound,
+            upper_bound,
+            dim,
+            max_iter,
+            seed,
+            population_size,
+        )
         seed: Random seed for reproducibility.
     """
-    super().__init__(func, lower_bound, upper_bound, dim, max_iter)
+    super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
The super().__init__() call should include the population_size parameter. While you're setting self.population_size separately, AbstractOptimizer accepts population_size as a parameter and it should be passed through for consistency. Consider: super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
Suggested change:
-        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
+        super().__init__(
+            func,
+            lower_bound,
+            upper_bound,
+            dim,
+            max_iter,
+            seed,
+            population_size,
+        )
         seed: Random seed for reproducibility.
     """
-    super().__init__(func, lower_bound, upper_bound, dim, max_iter)
+    super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
The super().__init__() call should include the population_size parameter for consistency with AbstractOptimizer's interface. Consider: super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
Suggested change:
-        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
+        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
         seed: Random seed for reproducibility.
     """
-    super().__init__(func, lower_bound, upper_bound, dim, max_iter)
+    super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
The super().__init__() call should include the population_size parameter for consistency with AbstractOptimizer's interface. Consider: super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
Suggested change:
-        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed)
+        super().__init__(func, lower_bound, upper_bound, dim, max_iter, seed, population_size)
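The pattern the four suggestions above recommend, forwarding population_size to the parent constructor instead of setting it afterwards, can be sketched like this. The AbstractOptimizer signature and the subclass name shown are assumptions inferred from the review comments, not the repository's exact code:

```python
class AbstractOptimizer:
    """Hypothetical base class, modeled on the signature in the review."""

    def __init__(self, func, lower_bound, upper_bound, dim,
                 max_iter=100, seed=None, population_size=30):
        self.func = func
        self.lower_bound = lower_bound
        self.upper_bound = upper_bound
        self.dim = dim
        self.max_iter = max_iter
        self.seed = seed
        self.population_size = population_size

class TeachingLearningOptimizer(AbstractOptimizer):
    """Hypothetical subclass illustrating the pass-through pattern."""

    def __init__(self, func, lower_bound, upper_bound, dim,
                 max_iter=100, seed=None, population_size=30):
        # Forward population_size so the parent constructor sees a
        # consistent state, rather than overwriting the attribute after.
        super().__init__(func, lower_bound, upper_bound, dim,
                         max_iter, seed, population_size)

opt = TeachingLearningOptimizer(abs, -5.0, 5.0, 3, seed=42, population_size=50)
```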
 **Reproducibility**:
-    - **Deterministic**: FIXME: [Yes/No] - Same seed guarantees same results
     - **BBOB compliance**: seed parameter required for 15 independent runs
+    - **Deterministic**: No - uses unse random number generation
The phrase "uses unse random number generation" contains a typo. It should be "uses unseeded random number generation".
Suggested change:
-    - **Deterministic**: No - uses unse random number generation
+    - **Deterministic**: No - uses unseeded random number generation
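The determinism the docstring should eventually claim follows directly from seeding the generator: with NumPy's `default_rng`, the same seed reproduces the same random stream, which is what BBOB's 15 independent runs rely on. A minimal sketch:

```python
import numpy as np

def sample(seed):
    # A seeded Generator produces a reproducible random stream.
    rng = np.random.default_rng(seed)
    return rng.random(3)

a = sample(123)   # same seed ...
b = sample(123)   # ... same draws
c = sample(124)   # different seed, different draws
```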
uv run ruff check opt/social_inspired/