Conversation

@edenreich
Contributor

Implements background mode execution for prompts that can:

  • Parse GitHub issue references in multiple formats
  • Fetch GitHub issue context using GitHub API
  • Execute prompts iteratively until task completion
  • Support up to 10 iterations with completion detection
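The reference-parsing step described above could be sketched as follows. This is a hypothetical illustration, not the actual parser in cmd/prompt.go: the `parseIssueRef` helper, the `IssueRef` type, and the exact regular expressions are all assumptions; only the three supported forms (`#123`, `owner/repo#123`, and full GitHub issue URLs) come from the PR.

```go
package main

import (
	"fmt"
	"regexp"
)

// IssueRef holds a parsed reference; Owner and Repo are empty for a bare
// "#123" form, where the current repository is implied.
type IssueRef struct {
	Owner, Repo, Number string
}

// Patterns are ordered from most to least specific so that a URL is not
// mistakenly matched by the bare "#N" pattern.
var issueRefPatterns = []*regexp.Regexp{
	regexp.MustCompile(`github\.com/([\w.-]+)/([\w.-]+)/issues/(\d+)`), // full URL
	regexp.MustCompile(`([\w.-]+)/([\w.-]+)#(\d+)`),                    // owner/repo#N
	regexp.MustCompile(`#(\d+)`),                                       // bare #N
}

// parseIssueRef scans a prompt for the first recognizable issue reference.
func parseIssueRef(prompt string) (IssueRef, bool) {
	for _, re := range issueRefPatterns {
		if m := re.FindStringSubmatch(prompt); m != nil {
			if len(m) == 4 {
				return IssueRef{Owner: m[1], Repo: m[2], Number: m[3]}, true
			}
			return IssueRef{Number: m[1]}, true
		}
	}
	return IssueRef{}, false
}

func main() {
	ref, ok := parseIssueRef("Please fix the github owner/repo#456")
	fmt.Println(ok, ref.Owner, ref.Repo, ref.Number)
}
```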

Closes #22

Generated with Claude Code
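The iterate-until-completion behavior could look roughly like the sketch below. It is a minimal illustration under stated assumptions: `runBackground`, `sendToModel`, and the `TASK_COMPLETE` marker are invented names; the PR only specifies a cap of 10 iterations plus some form of completion detection, and the real CLI may signal completion differently.

```go
package main

import (
	"fmt"
	"strings"
)

const maxIterations = 10

// runBackground sends the prompt to the model repeatedly until the response
// signals completion or the iteration cap is reached. sendToModel stands in
// for the real chat-service call. It returns the final reply and the number
// of iterations used.
func runBackground(prompt string, sendToModel func(string) string) (string, int) {
	var reply string
	for i := 1; i <= maxIterations; i++ {
		reply = sendToModel(prompt)
		// Completion detection: a marker string is one simple heuristic.
		if strings.Contains(reply, "TASK_COMPLETE") {
			return reply, i
		}
		// Feed the previous response back so the next turn can continue.
		prompt = "Continue the task. Previous response:\n" + reply
	}
	return reply, maxIterations
}

func main() {
	calls := 0
	fake := func(p string) string {
		calls++
		if calls == 3 {
			return "All done. TASK_COMPLETE"
		}
		return "working..."
	}
	reply, n := runBackground("Fix issue #22", fake)
	fmt.Println(n, reply)
}
```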

@edenreich
Contributor Author

@claude Please fix the CI - I've fast-forwarded this branch from upstream main.

@claude
Contributor

claude bot commented Aug 14, 2025

Claude finished @edenreich's task


CI Issues Fixed! ✅

Successfully resolved the CI type error and committed the fix to the PR branch.

Todo List:

  • Check CI status and identify failures
  • Set up development environment
  • Examine the type error in cmd/prompt.go:294
  • Fix the type error using the FormatToolResultForLLM function
  • Run local quality checks to verify fix
  • Commit fixes to the PR branch

Issue Fixed: The code attempted to return a *domain.ToolExecutionResult where a string was expected. I added the missing ui import and changed the return statement to use ui.FormatToolResultForLLM(result) so the result is properly converted to a string.

Changes Made:

  • Added import for "github.com/inference-gateway/cli/internal/ui"
  • Changed return result, nil to return ui.FormatToolResultForLLM(result), nil
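The shape of this fix can be sketched with minimal stand-in types. Everything here is a simplified assumption: the real domain.ToolExecutionResult and ui.FormatToolResultForLLM in the CLI have richer definitions, and the field names and output format below are invented for illustration; only the pattern (flatten the struct to a string before returning it) reflects the actual change.

```go
package main

import "fmt"

// ToolExecutionResult is a minimal stand-in for domain.ToolExecutionResult.
type ToolExecutionResult struct {
	ToolName string
	Output   string
	Success  bool
}

// formatToolResultForLLM mirrors the role of ui.FormatToolResultForLLM:
// flatten a structured result into the string the LLM interface expects.
func formatToolResultForLLM(r *ToolExecutionResult) string {
	status := "error"
	if r.Success {
		status = "ok"
	}
	return fmt.Sprintf("[%s] %s: %s", status, r.ToolName, r.Output)
}

func executeToolCall() (string, error) {
	result := &ToolExecutionResult{ToolName: "fetch", Output: "issue body", Success: true}
	// Before the fix this returned `result, nil`, which fails to type-check
	// because the function's first return value is a string.
	return formatToolResultForLLM(result), nil
}

func main() {
	s, _ := executeToolCall()
	fmt.Println(s)
}
```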

Verification:

  • ✅ Linter passes (0 issues)
  • ✅ Tests pass
  • ✅ Build succeeds

Commit: ba6c21c - "fix: Resolve type error in executeToolCall function"

The CI should now pass on the next run. The fix ensures that tool execution results are properly formatted as strings for LLM consumption, which maintains the expected interface contract.

claude bot and others added 4 commits August 22, 2025 00:57
Implement background mode execution for prompts that can:
- Parse GitHub issue references in multiple formats (#123, owner/repo#123, URLs)
- Fetch GitHub issue context using GitHub API
- Execute prompts iteratively until task completion
- Support up to 10 iterations with completion detection
- Integrate with existing model selection and chat services

The prompt command enables autonomous task solving with commands like:
- infer prompt "Please fix the github issue #123"
- infer prompt "Please fix the github owner/repo#456"
- infer prompt "Optimize the database queries"

Co-authored-by: Eden Reich <[email protected]>
Signed-off-by: Eden Reich <[email protected]>
Keep the CLI agentic; it should decide whether to use fetch to get details about the issue.

Signed-off-by: Eden Reich <[email protected]>
Convert ToolExecutionResult to string using ui.FormatToolResultForLLM() instead of
returning the struct directly. This fixes the typecheck error in the linter.

Co-authored-by: Eden Reich <[email protected]>
@edenreich edenreich force-pushed the claude/issue-22-20250812-2306 branch from d731625 to ad627da on August 21, 2025 22:58
@edenreich edenreich closed this Aug 21, 2025
@edenreich edenreich deleted the claude/issue-22-20250812-2306 branch August 21, 2025 23:00


Development

Successfully merging this pull request may close these issues.

[FEATURE] Implement One-off prompt (Background Mode)
