Absolutely! Here's a detailed explanation of how you can implement an agent-like system in your gopilot library that supports multi-step function calling (a reasoning chain), written in clear and structured English:
Multi-Step Function Calling Agent: Detailed Design
Goal
Allow the LLM (e.g., Gemini or others) to reason through multiple steps and dynamically call multiple Go functions in sequence, based on natural language input.
Example Scenario
User prompt:
"Based on today's weather, what should I wear in Istanbul?"
Desired Function Call Chain:
1. GetWeather(city: "Istanbul") → returns weather data
2. SuggestClothes(weatherData) → returns a clothing recommendation based on the weather
Architecture Overview
1. Function Registry
A central registry that stores all available functions along with:
- Name
- Description
- Input schema
- Output schema (if needed)
This is already partially in your project. Extend it to support:
```go
type Function interface {
	Name() string
	Description() string
	ParametersSchema() map[string]interface{}
	Execute(params map[string]interface{}) (interface{}, error)
}
```

2. Agent Loop (Recursive / Iterative Reasoner)
After the LLM generates the first function call, your agent should:
- Run the function
- Append the result to the conversation context
- Ask the LLM: "What should I do next?"
- Repeat until the LLM replies with a final answer or there are no more functions to call
This is similar to how OpenAI function-calling agents or LangChain agents work.
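The interface from step 1 and the loop from step 2 can be sketched together. This is a minimal, self-contained sketch, not gopilot's actual API: the `decide` callback stands in for a real LLM client, and `getWeather` is a hypothetical stub.

```go
package main

import "fmt"

// Function is the registry interface from step 1 (schema methods omitted for brevity).
type Function interface {
	Name() string
	Description() string
	Execute(params map[string]interface{}) (interface{}, error)
}

// getWeather is a hypothetical stub implementation used only for illustration.
type getWeather struct{}

func (getWeather) Name() string        { return "GetWeather" }
func (getWeather) Description() string { return "returns weather data for a city" }
func (getWeather) Execute(params map[string]interface{}) (interface{}, error) {
	city, _ := params["city"].(string)
	return map[string]interface{}{"city": city, "temp": 12, "condition": "cloudy"}, nil
}

// step is what the LLM asks for next: either a function call or a final answer.
type step struct {
	Call   string
	Params map[string]interface{}
	Final  string
}

// runAgent loops: ask the LLM what to do, execute the requested function,
// feed the result back, and repeat until the LLM returns a final answer.
// decide stands in for a real LLM round-trip.
func runAgent(registry map[string]Function, decide func(lastResult interface{}) step) (string, error) {
	var last interface{}
	for i := 0; i < 10; i++ { // hard cap to avoid infinite loops
		s := decide(last)
		if s.Final != "" {
			return s.Final, nil
		}
		fn, ok := registry[s.Call]
		if !ok {
			return "", fmt.Errorf("unknown function %q", s.Call)
		}
		out, err := fn.Execute(s.Params)
		if err != nil {
			return "", err
		}
		last = out
	}
	return "", fmt.Errorf("too many steps")
}

func main() {
	registry := map[string]Function{"GetWeather": getWeather{}}
	answer, err := runAgent(registry, func(last interface{}) step {
		if last == nil {
			// First turn: the "LLM" asks for the weather.
			return step{Call: "GetWeather", Params: map[string]interface{}{"city": "Istanbul"}}
		}
		// Second turn: the "LLM" has the result and answers.
		return step{Final: "Wear a light jacket."}
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
```

In a real implementation, `decide` would serialize the context and previous results into a prompt and parse the model's structured reply.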
3. Conversation Context
To maintain reasoning, you'll need a memory of previous steps:
```go
type AgentContext struct {
	Messages []LLMMessage   // chat history
	Results  map[string]any // named function results
}
```

At each step:
- Include the latest result in the context (e.g., "weather": {"temp": 15, "condition": "rainy"})
- Let the LLM decide what to do next
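Recording a result can both store it by name and append it to the chat history so the LLM sees it on the next turn. A minimal sketch, assuming a simple `LLMMessage` shape (the field names here are illustrative):

```go
package main

import "fmt"

// LLMMessage is a minimal chat message; the fields are assumptions, not gopilot's types.
type LLMMessage struct {
	Role    string
	Content string
}

type AgentContext struct {
	Messages []LLMMessage   // chat history
	Results  map[string]any // named function results
}

// RecordResult stores a named function result and appends it to the chat
// history so the LLM can reason over it on the next step.
func (c *AgentContext) RecordResult(name string, result any) {
	if c.Results == nil {
		c.Results = map[string]any{}
	}
	c.Results[name] = result
	c.Messages = append(c.Messages, LLMMessage{
		Role:    "function",
		Content: fmt.Sprintf("%s → %v", name, result),
	})
}

func main() {
	ctx := &AgentContext{}
	ctx.RecordResult("weather", map[string]any{"temp": 15, "condition": "rainy"})
	fmt.Println(len(ctx.Messages))
	fmt.Println(ctx.Results["weather"])
}
```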
4. Prompt Template for LLM
Use a dynamic system prompt like this:
```
You are an intelligent agent that can call Go functions to solve problems.

Available functions:
- GetWeather(city: string): returns weather data
- SuggestClothes(weatherData: Weather): returns clothing suggestions

Your job is to analyze the user's request and use the functions step-by-step.
You must output structured function_call JSON in each step.
```

You dynamically add previous results like this:
```
Previous result:
GetWeather("Istanbul") → { "temp": 12, "condition": "cloudy" }
```

5. LLM Output Handling
Allow the LLM to respond with structured JSON like:
```json
{
  "function_call": {
    "name": "SuggestClothes",
    "arguments": {
      "weatherData": {
        "temp": 12,
        "condition": "cloudy"
      }
    }
  }
}
```

Parse the JSON, route it to the actual Go function, and repeat the loop.
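Parsing that reply is a small amount of standard-library code. A sketch using `encoding/json`, assuming the exact JSON shape shown above (a missing `function_call` field is treated as the final answer):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// functionCall mirrors the JSON shape the LLM is instructed to emit.
type functionCall struct {
	Name      string                 `json:"name"`
	Arguments map[string]interface{} `json:"arguments"`
}

type llmReply struct {
	FunctionCall *functionCall `json:"function_call"`
}

// parseCall extracts the function_call from an LLM reply. ok is false when
// the reply carries no call, i.e. the model produced a final answer instead.
func parseCall(raw []byte) (functionCall, bool, error) {
	var r llmReply
	if err := json.Unmarshal(raw, &r); err != nil {
		return functionCall{}, false, err
	}
	if r.FunctionCall == nil {
		return functionCall{}, false, nil
	}
	return *r.FunctionCall, true, nil
}

func main() {
	raw := []byte(`{"function_call":{"name":"SuggestClothes","arguments":{"weatherData":{"temp":12,"condition":"cloudy"}}}}`)
	call, ok, err := parseCall(raw)
	if err != nil || !ok {
		panic("expected a function call")
	}
	fmt.Println(call.Name) // SuggestClothes
}
```

The agent loop would look up `call.Name` in the registry and pass `call.Arguments` to `Execute`.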
Example Loop Flow
1. User: "What should I wear in Istanbul today?"
2. LLM → Suggests calling GetWeather(city: "Istanbul")
3. Agent → Calls function → gets {temp: 12, condition: "cloudy"}
4. LLM → Uses result → suggests calling SuggestClothes(weatherData)
5. Agent → Calls function → gets "Wear a light jacket and waterproof shoes."
6. LLM → Replies with final answer
Optional Enhancements
- Dependency Graph: Auto-detect which functions depend on others via parameter types.
- Chain Execution Mode: Let the agent plan all steps at once.
- Error Recovery: Let the LLM try a fallback if a function call fails.
- Memory: Save context across sessions.
Would you like a real Go code example that shows this in practice? I can write a prototype agent loop for you.