Hi!
I am trying to run your function_minimization example to see whether everything is working with my local models.
I slightly changed the example's config.yaml to use a single local Ollama model (currently codellama:34b).
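For reference, the relevant part of my config.yaml looks roughly like this (reproduced from memory, so the field names may not match the shipped example exactly; the api_base is Ollama's default OpenAI-compatible endpoint):

```yaml
llm:
  # Ollama serves an OpenAI-compatible API on this port by default
  api_base: "http://localhost:11434/v1"
  api_key: "ollama"            # placeholder; Ollama does not check the key
  models:
    - name: "codellama:34b"    # local model pulled with `ollama pull codellama:34b`
      weight: 1.0
  temperature: 0.7
```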
I keep getting the error "No valid diffs found in response". On closer inspection, the LLM simply doesn't answer in the required format in those cases. In the other iterations, which don't raise the error, evolution appears to be running, but when I look at the code in the Visualizer there is no difference between the programs, so in the end my best program is again my initial program. Looking at the prompts in the Visualizer I can see that the LLM suggested changes, but they were not applied. Strangely, the evaluated metrics do change between programs.
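For context, my understanding from the prompt shown in the Visualizer is that the model is expected to respond with SEARCH/REPLACE blocks roughly like this (the code inside the blocks is made up for illustration; only the markers matter):

```
<<<<<<< SEARCH
def search_algorithm(iterations=1000):
    best_x = 0.0
=======
def search_algorithm(iterations=2000):
    best_x = random.uniform(-5, 5)
>>>>>>> REPLACE
```

In the failing responses, the model instead writes prose or a full code listing without these markers, which would explain why no diff can be extracted.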
I might be looking at the wrong things in the Visualizer, as I do not see any "diff", only the "code" tab per program.
Any help is greatly appreciated!