
HiFo-Prompt

Prompting with Hindsight and Foresight for LLM-based Automatic Heuristic Design

🧠 Hindsight Insight Pool Β· πŸ”­ Foresight Evolutionary Navigator Β· πŸ”„ Closed-Loop Evolution


English Β· δΈ­ζ–‡ (Chinese)


Welcome! This repository provides the code implementation for the paper HiFo-Prompt: Prompting with Hindsight and Foresight for LLM-based Automatic Heuristic Design.

πŸ“’ News

  • Jan. 2026: πŸŽ‰πŸŽ‰ HiFo-Prompt: Prompting with Hindsight and Foresight for LLM-based Automatic Heuristic Design has been accepted at ICLR 2026 as a Poster!

πŸ“– Introduction

The framework of HiFo-Prompt

HiFo-Prompt (Hindsight-Foresight Prompt) is a novel framework for Automatic Heuristic Design (AHD) that synergizes Large Language Models (LLMs) with Evolutionary Computation (EC).

Existing LLM-based methods often suffer from short-term memory (successful design tricks are forgotten across generations) and a lack of direction (the search wanders without an explicit strategy). HiFo-Prompt addresses both with two key mechanisms:

  • 🧠 Hindsight (The Insight Pool): A self-evolving knowledge base that distills and stores "design principles" from high-performing heuristics, preventing the system from reinventing the wheel.

  • πŸ”­ Foresight (The Evolutionary Navigator): A meta-controller that monitors population dynamics (stagnation, diversity) and actively switches search regimes (Explore, Exploit, or Balance) via specific Design Directives.


Dynamic prompt generation process of HiFo-Prompt

πŸ”₯ Key Features

| Component | Function | Why it matters |
| --- | --- | --- |
| Insight Pool | Extracts & reuses knowledge | Instead of discarding parents, we extract *why* they worked; prompts are augmented with proven "Insights". |
| Evolutionary Navigator | Adaptive control | Detects whether the search is stuck (stagnation) or too narrow (low diversity) and dynamically adjusts the prompt strategy. |
| Decoupled Evaluation | Efficient pipeline | Decouples "Thought" from "Code", allowing faster iteration and lower token consumption than standard methods. |
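To make the Insight Pool concrete, here is a minimal sketch of such a self-evolving knowledge base. The data model, scoring, and method names are illustrative assumptions, not the repository's actual API:

```python
# Minimal sketch of a Hindsight-style insight pool (data model and
# credit scheme are hypothetical, not HiFo-Prompt's actual implementation).

from dataclasses import dataclass, field

@dataclass
class Insight:
    text: str              # distilled design principle from a strong heuristic
    utility: float = 0.0   # running credit from offspring that used it
    uses: int = 0

@dataclass
class InsightPool:
    capacity: int = 20
    insights: list[Insight] = field(default_factory=list)

    def add(self, text: str) -> None:
        """Store a new principle, evicting the least useful when full."""
        self.insights.append(Insight(text))
        if len(self.insights) > self.capacity:
            self.insights.sort(key=lambda i: i.utility, reverse=True)
            self.insights.pop()

    def top(self, k: int = 3) -> list[str]:
        """Return the k highest-utility principles for prompt augmentation."""
        ranked = sorted(self.insights, key=lambda i: i.utility, reverse=True)
        return [i.text for i in ranked[:k]]

    def credit(self, text: str, reward: float) -> None:
        """Reward an insight when an offspring that cited it performs well."""
        for i in self.insights:
            if i.text == text:
                i.uses += 1
                i.utility += reward
```

The credit step closes the loop: insights that keep producing strong offspring rise to the top and appear in more prompts, which is what prevents the system from reinventing the wheel.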

πŸ› οΈ Installation

We recommend using Conda to manage the environment.

# 1. Create environment
conda create -n hifo python=3.10
conda activate hifo

# 2. Clone repository
git clone https://github.com/Challenger-XJTU/HiFo-Prompt.git
cd HiFo-Prompt

# 3. Install dependencies
cd hifo
pip install .

πŸš€ Quick Start

Note: You must have an LLM API key (e.g., OpenAI, DeepSeek, Qwen) or a local LLM server running.

1. Basic Usage Structure

from hifo import hifo
from hifo.utils.getParas import Paras

# 1. Initialize Parameters
paras = Paras() 

# 2. Configure HiFo
paras.set_paras(
    method = "hifo",               
    problem = "tsp_construct",          # Problem: 'tsp_construct', 'bp_online'
    llm_api_endpoint = "api.deepseek.com", # Your API Endpoint
    llm_api_key = "sk-xxxxxxxx",        # Your API Key
    llm_model = "deepseek-chat",        # Model Name
    ec_pop_size = 8,                    # Population size
    ec_n_pop = 8,                      # Number of generations
    exp_n_proc = 4,                     # Parallel threads for evaluation
    exp_debug_mode = False              # Set True to see prompt construction details
)

# 3. Initialize & Run
evolution = hifo.EVOL(paras)
evolution.run()

2. Running Examples

We provide ready-to-run scripts for standard combinatorial optimization problems.

Traveling Salesman Problem (TSP)

Constructive heuristic design for TSP.

cd examples/tsp_construct
python runHiFo.py
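The heuristics evolved here are constructive rules that pick the next node to visit. A hand-written baseline of that shape, nearest-neighbor, is sketched below; the function signatures are illustrative, not the repository's exact evaluation template:

```python
# Illustrative nearest-neighbor heuristic of the shape HiFo-Prompt evolves
# for constructive TSP (signatures are assumptions, not the repo's template).

import math

def select_next_node(current: int, unvisited: list[int],
                     coords: list[tuple[float, float]]) -> int:
    """Greedy selection rule: pick the closest unvisited node."""
    cx, cy = coords[current]
    return min(unvisited,
               key=lambda j: math.hypot(coords[j][0] - cx, coords[j][1] - cy))

def build_tour(coords: list[tuple[float, float]]) -> list[int]:
    """Construct a full tour by repeatedly applying the selection rule."""
    tour, unvisited = [0], list(range(1, len(coords)))
    while unvisited:
        nxt = select_next_node(tour[-1], unvisited, coords)
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour
```

The LLM's job during evolution is to rewrite the selection rule (not the tour loop), guided by insights and the current Design Directive.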

Online Bin Packing (BPP)

Designing scoring functions for online packing.

cd examples/bp_online
python runHiFo.py
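Online BPP heuristics here take the form of scoring functions: given an arriving item and the bins' remaining capacities, score every bin and place the item in the best-scoring one. A best-fit-style baseline of that shape (the signatures are assumptions, not the repository's exact template):

```python
# Illustrative best-fit scoring function of the shape evolved for online BPP
# (signatures are assumptions, not the repo's exact template).

def score_bins(item: float, bins: list[float]) -> list[float]:
    """Score each bin's remaining capacity; higher is better.

    Best fit: prefer the bin that the item fills most tightly,
    and rule out bins that cannot hold the item at all.
    """
    return [-(cap - item) if cap >= item else float("-inf") for cap in bins]

def place(item: float, bins: list[float]) -> int:
    """Return the index of the chosen bin (highest score wins)."""
    scores = score_bins(item, bins)
    return max(range(len(bins)), key=scores.__getitem__)
```

During evolution the LLM rewrites only the scoring rule, so candidate heuristics stay cheap to evaluate on streamed item sequences.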

Custom Problem

cd examples/user_XXX
python runHiFo.py

βš™οΈ LLM Configuration

HiFo-Prompt supports both remote APIs and local LLM deployment.

Option A: Remote API (Recommended)

Supported protocols: OpenAI-compatible APIs (DeepSeek, Moonshot, ChatGPT, etc.).

Modify runHiFo.py:

llm_api_endpoint = "api.openai.com" 
llm_api_key = "your_key"
llm_model = "gpt-4o"

Option B: Local LLM (vLLM / HuggingFace)

  1. Start your local server (e.g., using vLLM):

python -m vllm.entrypoints.openai.api_server --model Qwen/Qwen2.5-7B-Instruct --port 8000

  2. Configure HiFo:

llm_use_local = True
llm_local_url = "http://localhost:8000/v1/chat/completions"
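Any OpenAI-compatible server accepts the standard chat-completions request body, so you can sanity-check your local endpoint independently of HiFo. A minimal request builder using only the standard library (the URL and model name are placeholders for your own deployment):

```python
# Build a standard chat-completions request for an OpenAI-compatible server
# (URL and model name are placeholders; adjust to your deployment).

import json
import urllib.request

def build_request(url: str, model: str, prompt: str) -> urllib.request.Request:
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1.0,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://localhost:8000/v1/chat/completions",
                    "Qwen/Qwen2.5-7B-Instruct",
                    "Write a constructive TSP heuristic.")
# Send with urllib.request.urlopen(req) once the server is up.
```

If this request succeeds against your server, the same endpoint and model name should work in the HiFo configuration above.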

πŸ“‚ Project Structure

HiFo-Prompt/
β”œβ”€β”€ hifo/
β”‚   β”œβ”€β”€ src/hifo/
β”‚   β”‚   β”œβ”€β”€ methods/
β”‚   β”‚   β”‚   └── hifo/
β”‚   β”‚   β”‚       β”œβ”€β”€ hifo.py                   # Main HiFo Algorithm
β”‚   β”‚   β”‚       β”œβ”€β”€ hifo_evolution.py         # Evolution Operators (i1, e1, m1, etc.)
β”‚   β”‚   β”‚       β”œβ”€β”€ insight_pool.py           # 🧠 Hindsight Module
β”‚   β”‚   β”‚       └── evolutionary_navigator.py # πŸ”­ Foresight Module (Regime Control)
β”‚   β”‚   β”œβ”€β”€ llm/                              # LLM Interfaces
β”‚   β”‚   β”œβ”€β”€ problems/                         # Problem Definitions
β”‚   β”‚   └── utils/                            # Parameter parsing & helpers
β”‚   └── setup.py
β”œβ”€β”€ examples/                                 # Problem-specific runners (TSP, BPP, etc.)
└── docs/                                     # Documentation & Tutorials

πŸ“Š Configuration Options

| Parameter | Description | Default |
| --- | --- | --- |
| method | Algorithm method (hifo, ael) | hifo |
| problem | Problem type | tsp_construct |
| ec_pop_size | Population size per generation | 8 |
| ec_n_pop | Number of generations | 8 |
| exp_n_proc | Number of parallel processes | 4 |
| eva_timeout | Evaluation timeout (seconds) | 300 |
| exp_debug_mode | Enable debug output | False |

πŸ“œ Citation & Community

We are actively maintaining HiFo-Prompt and thrilled to hear from the community!

  • Need Help? If you run into any bugs or have feature requests, please check the Issues page or submit a new one.
  • Collaboration: We are open to discussions on AHD and LLMs. Feel free to reach out via email or pull requests. Let's push the boundaries of automated algorithm design together! 🀝

Support Us: If HiFo-Prompt aids your research or if you like our approach, please Star ⭐ or Fork 🍴 this repository. Your support drives our updates!

@inproceedings{chen2026hifo,
  title     = {HiFo-Prompt: Prompting with Hindsight and Foresight for LLM-based Automatic Heuristic Design},
  author    = {Chen, Chentong and Zhong, Mengyuan and Fan, Ye and Shi, Jialong and Sun, Jianyong},
  booktitle = {International Conference on Learning Representations},
  year      = {2026},
  url       = {https://openreview.net/forum?id=imSLzfZ6av}
}

✨ Acknowledgments

Our work builds upon EoH and ReEvo, and we thank the authors for their inspiring work. We also acknowledge LLM4AD and FM4CO for their valuable learning resources on LLM-based AHD.
