examples/function_minimization/README.md
Through evolutionary iterations, OpenEvolve discovered several key algorithmic concepts:
1. **Exploration via Temperature**: Simulated annealing uses a `temperature` parameter to allow uphill moves early in the search, helping escape local minima that would trap simpler methods.
   ```python
   probability = np.exp((current_value - new_value) / temperature)
   ```

2. **Adaptive Step Size**: The step size is adjusted dynamically—shrinking as the search converges and expanding if progress stalls—leading to better coverage and faster convergence.
   ```python
   if i > iterations * 0.75:  # Reduce step size towards the end
       step_size *= 0.5
   if no_improvement_count > step_size_increase_threshold:  # Increase step size if stuck
       step_size *= 1.1
       no_improvement_count = 0  # Reset the counter
   ```

3. **Bounded Moves**: The algorithm ensures all candidate solutions remain within the feasible domain, avoiding wasted evaluations.
   ```python
   # Keep the new points within the bounds
   new_x = max(bounds[0], min(new_x, bounds[1]))
   new_y = max(bounds[0], min(new_y, bounds[1]))
   ```

4. **Stagnation Handling**: By counting iterations without improvement, the algorithm responds by boosting exploration when progress stalls.
   ```python
   if no_improvement_count > step_size_increase_threshold:  # Increase step size if stuck
       step_size *= 1.1
       no_improvement_count = 0  # Reset the counter
   ```
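
The four ideas above can be combined into a minimal runnable sketch. This is an illustration, not the evolved program itself: the objective function, the random seed, and parameter names such as `cooling_rate` and `step_size_increase_threshold` are assumptions filled in for the example.

```python
import numpy as np

def search_algorithm(bounds=(-5, 5), iterations=2000, initial_temperature=100,
                     cooling_rate=0.99, step_size_increase_threshold=50):
    # Illustrative sketch only; parameter names beyond `bounds`, `iterations`,
    # and `initial_temperature` are assumptions, as is the toy objective below.
    rng = np.random.default_rng(0)

    def f(x, y):
        # Toy objective: a 2-D bowl with its minimum at (0, 0)
        return x**2 + y**2

    x = rng.uniform(*bounds)
    y = rng.uniform(*bounds)
    best_x, best_y, best_value = x, y, f(x, y)
    temperature = initial_temperature
    step_size = (bounds[1] - bounds[0]) * 0.1
    no_improvement_count = 0

    for i in range(iterations):
        # Bounded moves: clamp each proposal to the feasible domain
        new_x = max(bounds[0], min(x + rng.normal(0, step_size), bounds[1]))
        new_y = max(bounds[0], min(y + rng.normal(0, step_size), bounds[1]))
        current_value, new_value = f(x, y), f(new_x, new_y)

        # Exploration via temperature: accept improvements always, and
        # uphill moves with a probability that shrinks as temperature decays
        if new_value < current_value or rng.random() < np.exp(
                (current_value - new_value) / temperature):
            x, y = new_x, new_y

        # Memory: track the best solution seen so far
        if f(x, y) < best_value:
            best_x, best_y, best_value = x, y, f(x, y)
            no_improvement_count = 0
        else:
            no_improvement_count += 1

        # Adaptive step size: shrink late in the run, grow when stuck
        if i > iterations * 0.75:
            step_size *= 0.5
        if no_improvement_count > step_size_increase_threshold:
            step_size *= 1.1
            no_improvement_count = 0

        temperature *= cooling_rate

    return best_x, best_y, best_value
```

With a fixed seed the run is deterministic, and on this smooth bowl the loop reliably drives the best value close to the global minimum at the origin.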