# Function Minimization Example

This example demonstrates how OpenEvolve can discover sophisticated optimization algorithms starting from a simple implementation.

## Problem Description

The task is to minimize a complex non-convex function with multiple local minima:

```
f(x, y) = sin(x) * cos(y) + sin(x * y) + (x^2 + y^2) / 20
```

The global minimum is approximately at (-1.704, 0.678), with a value of -1.519.
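For reference, the objective is easy to write down with NumPy. This `evaluate_function` is an illustrative stand-in matching the formula above (the example ships its own definition in the repository):

```python
import numpy as np

def evaluate_function(x, y):
    """f(x, y) = sin(x) * cos(y) + sin(x*y) + (x^2 + y^2) / 20"""
    return np.sin(x) * np.cos(y) + np.sin(x * y) + (x**2 + y**2) / 20
```

Evaluating it at the reported optimum, `evaluate_function(-1.704, 0.678)`, returns approximately -1.519.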
## Getting Started

To run this example:

```bash
cd examples/function_minimization
python ../../openevolve-run.py initial_program.py evaluator.py --config config.yaml
```
## Algorithm Evolution

### Initial Algorithm (Random Search)

The initial implementation was a simple random search that had no memory between iterations:

```python
def search_algorithm(iterations=1000, bounds=(-5, 5)):
    """
    A simple random search algorithm that often gets stuck in local minima.

    Args:
        iterations: Number of iterations to run
        bounds: Bounds for the search space (min, max)

    Returns:
        Tuple of (best_x, best_y, best_value)
    """
    # Initialize with a random point
    best_x = np.random.uniform(bounds[0], bounds[1])
    best_y = np.random.uniform(bounds[0], bounds[1])
    best_value = evaluate_function(best_x, best_y)

    for _ in range(iterations):
        # Simple random search: sample a fresh point each iteration
        x = np.random.uniform(bounds[0], bounds[1])
        y = np.random.uniform(bounds[0], bounds[1])
        value = evaluate_function(x, y)

        if value < best_value:
            best_value = value
            best_x, best_y = x, y

    return best_x, best_y, best_value
```
### Evolved Algorithm (Simulated Annealing)

After running OpenEvolve, it discovered a simulated annealing algorithm with a completely different approach:

```python
def simulated_annealing(bounds=(-5, 5), iterations=1000, step_size=0.1, initial_temperature=100, cooling_rate=0.99):
    """
    Simulated annealing algorithm for function minimization.

    Args:
        bounds: Bounds for the search space (min, max)
        iterations: Number of iterations to run
        step_size: Step size for perturbing the solution
        initial_temperature: Initial temperature for the simulated annealing process
        cooling_rate: Cooling rate for the simulated annealing process

    Returns:
        Tuple of (best_x, best_y, best_value)
    """
    # Initialize with a random point
    best_x = np.random.uniform(bounds[0], bounds[1])
    best_y = np.random.uniform(bounds[0], bounds[1])
    best_value = evaluate_function(best_x, best_y)

    current_x, current_y = best_x, best_y
    current_value = best_value
    temperature = initial_temperature

    for _ in range(iterations):
        # Perturb the current solution
        new_x = current_x + np.random.uniform(-step_size, step_size)
        new_y = current_y + np.random.uniform(-step_size, step_size)

        # Ensure the new solution is within bounds
        new_x = max(bounds[0], min(new_x, bounds[1]))
        new_y = max(bounds[0], min(new_y, bounds[1]))

        new_value = evaluate_function(new_x, new_y)

        # Accept improvements outright; otherwise accept worse moves
        # with a temperature-dependent probability (Metropolis criterion)
        if new_value < current_value:
            current_x, current_y = new_x, new_y
            current_value = new_value

            if new_value < best_value:
                best_x, best_y = new_x, new_y
                best_value = new_value
        else:
            probability = np.exp((current_value - new_value) / temperature)
            if np.random.rand() < probability:
                current_x, current_y = new_x, new_y
                current_value = new_value

        # Cool down the temperature
        temperature *= cooling_rate

    return best_x, best_y, best_value
```
## Key Improvements

Through evolutionary iterations, OpenEvolve discovered several key algorithmic concepts:

1. **Local Search**: Instead of random sampling across the entire space, the evolved algorithm makes small perturbations to promising solutions:

   ```python
   new_x = current_x + np.random.uniform(-step_size, step_size)
   new_y = current_y + np.random.uniform(-step_size, step_size)
   ```

2. **Temperature-based Acceptance**: The algorithm can escape local minima by occasionally accepting worse solutions:

   ```python
   probability = np.exp((current_value - new_value) / temperature)
   if np.random.rand() < probability:
       current_x, current_y = new_x, new_y
       current_value = new_value
   ```

3. **Cooling Schedule**: The temperature gradually decreases, transitioning from exploration to exploitation:

   ```python
   temperature *= cooling_rate
   ```

4. **Parameter Introduction**: The system discovered the need for additional parameters to control the algorithm's behavior:

   ```python
   def simulated_annealing(bounds=(-5, 5), iterations=1000, step_size=0.1, initial_temperature=100, cooling_rate=0.99):
   ```
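To make the acceptance rule and cooling schedule concrete, here is a small standalone sketch (the numbers are illustrative, not taken from an actual run). Early in the search, a move that worsens the objective by 0.5 is accepted almost surely; after 1000 cooling steps the same move is essentially always rejected:

```python
import math

initial_temperature, cooling_rate = 100, 0.99
delta = -0.5  # current_value - new_value for a move that is worse by 0.5

# Acceptance probability while the system is still hot
p_hot = math.exp(delta / initial_temperature)

# Apply the same multiplicative schedule as the evolved algorithm
temperature = initial_temperature
for _ in range(1000):
    temperature *= cooling_rate

# Acceptance probability for the same move after cooling
p_cold = math.exp(delta / temperature)

print(f"temperature after 1000 steps: {temperature:.4f}")  # ~0.0043
print(f"p(accept worse move), hot:    {p_hot:.3f}")        # ~0.995
print(f"p(accept worse move), cold:   {p_cold:.2e}")       # vanishingly small
```

This is the mechanism behind the exploration-to-exploitation transition: the same uphill move goes from near-certain acceptance to near-certain rejection purely because the temperature decayed.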
## Results

The evolved algorithm shows substantial improvement in finding better solutions:

| Metric | Value |
|--------|-------|
| Value Score | 0.677 |
| Distance Score | 0.258 |
| Reliability Score | 1.000 |
| Overall Score | 0.917 |
| Combined Score | 0.584 |

The simulated annealing algorithm:

- Achieves higher-quality solutions (closer to the global minimum)
- Has perfect reliability (100% success rate in completing runs)
- Maintains a good balance between performance and reliability
## How It Works

This example demonstrates key features of OpenEvolve:

- **Code Evolution**: Only the code inside the evolve blocks is modified
- **Complete Algorithm Redesign**: The system transformed a random search into a completely different algorithm
- **Automatic Discovery**: The system discovered simulated annealing without being explicitly programmed with knowledge of optimization algorithms
- **Function Renaming**: The system even recognized that the algorithm should have a more descriptive name
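The evolve-block mechanism can be pictured as comment markers that fence off the only region the LLM is allowed to rewrite. The marker names below are an assumption based on OpenEvolve's convention; check the project's documentation and the shipped `initial_program.py` for the exact syntax:

```python
import math

def evaluate_function(x, y):
    """Harness code: stays outside the evolve block and is never modified."""
    return math.sin(x) * math.cos(y) + math.sin(x * y) + (x**2 + y**2) / 20

# EVOLVE-BLOCK-START
def search_algorithm(iterations=1000, bounds=(-5, 5)):
    # Placeholder body: this region is what OpenEvolve rewrites
    # each generation (e.g., into simulated_annealing above)
    best_x = best_y = 0.0
    return best_x, best_y, evaluate_function(best_x, best_y)
# EVOLVE-BLOCK-END
```

Because the evaluator and objective live outside the markers, every candidate program is scored the same way no matter how radically the search logic inside the block changes.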
## Next Steps

Try modifying `config.yaml` to:

- Increase the number of iterations
- Change the LLM model configuration
- Adjust the evaluator settings to prioritize different metrics

You can also try a different objective function by modifying `evaluate_function()` in the example code.