fix: ensure AI uses tools before asking user questions #3879
Conversation
hannesrudolph left a comment
Code Review: Fix - Ensure AI uses tools before asking user questions
Overview
This PR addresses a UX issue where the AI was asking users for information (like file paths) that it could have found itself using available tools. The fix adds explicit instructions to the ask_followup_question tool to prioritize using available tools before asking users for information.
✅ Strengths
Problem Identification:
- Clearly addresses a real UX pain point where AI appears less capable than it actually is
- Well-documented issue (#3873) that this PR fixes
- Good understanding of the root cause: insufficient guidance in tool descriptions
Implementation Approach:
- Minimal, targeted change that addresses the core issue
- Uses system prompt modifications rather than code logic changes
- Maintains backward compatibility while improving behavior
Documentation:
- Clear, structured instructions with numbered priorities
- Specific examples of tools to use (
search_files,list_files) - Good balance between being directive and maintaining flexibility
🔍 Technical Implementation
Changes Made:
- ask-followup-question.ts: Added 3 key instructions to the tool description
- system.test.ts.snap: Updated all snapshot tests to reflect the new prompt
Instruction Hierarchy (sketched in code after this list):
1. MUST FIRST use available tools to locate information
2. ONLY request user details when tools cannot retrieve them
3. PRIORITIZE the AI's own knowledge, asking only when human consideration is needed
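As a rough illustration of how this hierarchy could be embedded in the tool description, consider the sketch below. The exact wording and the function signature are not quoted in this review, so the prompt text and the parameter-free signature here are approximations assembled from the phrases cited above, not the PR's actual diff.

```typescript
// Illustrative sketch only — wording and signature are approximated from the
// review above, not copied from the PR. The real function may accept
// arguments (e.g. for prompt variants) that are omitted here.
export function getAskFollowupQuestionDescription(): string {
	return `## ask_followup_question
Description: Ask the user a question to gather additional information needed to complete the task.
IMPORTANT:
1. YOU MUST FIRST use available tools (eg, search_files, list_files) to locate the information yourself.
2. ONLY request details from the user when tools cannot retrieve them (eg, design specific considerations).
3. PRIORITIZE your own knowledge and analysis; ask only when human consideration is genuinely needed.
Parameters:
- question: (required) The question to ask the user.`
}
```

Because the description is generated in one place, a change like this only has to be made once for it to appear wherever the system prompt includes the tool.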
✅ Code Quality Assessment
Prompt Engineering:
- Well-structured with clear priority ordering
- Uses emphatic language (YOU MUST, IMPORTANT) to ensure compliance
- Provides concrete examples (eg, search_files, list_files)
- Maintains original tool purpose while adding constraints
Test Coverage (illustrated after this list):
- ✅ All snapshot tests updated consistently
- ✅ Tests will catch future regressions in prompt changes
- ✅ No functional tests needed as this is prompt-based behavior change
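For context, a snapshot test of this kind might look like the sketch below. The repository's actual tests live in system.test.ts with the stored snapshot in system.test.ts.snap; the import path and test setup shown here are assumptions.

```typescript
// Illustrative sketch — paths, file layout, and setup are assumptions, not
// the repository's actual test code.
import { getAskFollowupQuestionDescription } from "../ask-followup-question"

describe("ask_followup_question description", () => {
	it("matches the stored snapshot", () => {
		// Any edit to the prompt text fails this test until the snapshot is
		// deliberately regenerated, so prompt regressions are caught in review.
		expect(getAskFollowupQuestionDescription()).toMatchSnapshot()
	})
})
```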
Maintainability (sketched below):
- Single source of truth for the tool description
- Changes propagate automatically to all system prompt variants
- Easy to modify or extend in the future
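A rough sketch of what "single source of truth" means structurally is shown below. The registry and function names are hypothetical and do not reflect the repository's actual architecture; the point is only that one description function can feed every prompt variant.

```typescript
// Hypothetical sketch — names and structure are illustrative, not the
// repository's actual code.
declare function getAskFollowupQuestionDescription(): string

const toolDescriptions: Record<string, () => string> = {
	ask_followup_question: getAskFollowupQuestionDescription,
	// ...other tools registered the same way
}

function buildSystemPrompt(enabledTools: string[]): string {
	// Every prompt variant assembles from the same map, so editing the
	// description function once propagates the new constraints everywhere.
	return enabledTools
		.filter((name) => name in toolDescriptions)
		.map((name) => toolDescriptions[name]())
		.join("\n\n")
}
```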
🔍 Behavioral Impact Analysis
Expected Improvements:
- Reduced unnecessary user interactions
- More autonomous AI behavior
- Better utilization of available tools
- Improved user perception of AI capability
Potential Considerations:
- AI might now spend more time searching before asking questions (generally positive)
- Could potentially delay legitimate questions that require user input
- May need monitoring to ensure AI doesn't get stuck in tool loops
⚠️ Minor Observations
Grammar/Style:
- Point 2 has a trailing space: "them (eg, design specific considerations). "
- Could be more concise while maintaining clarity
Examples:
- The (eg, design specific considerations) example could be more specific
- Consider adding more concrete examples of when user input IS appropriate
Tool Coverage:
- Currently mentions search_files, list_files; could include other relevant tools
- Might benefit from "..." to indicate this is not an exhaustive list
📋 Testing Strategy
Manual Testing Recommended:
- ✅ Ask AI to find files without exact paths
- ✅ Request information that exists in the codebase
- ✅ Verify AI still asks legitimate questions when needed
Areas to Monitor:
- Response time impact from additional tool usage
- Quality of questions when AI does ask them
- User satisfaction with reduced back-and-forth
Security & Performance
- ✅ No security implications
- ✅ Minimal performance impact (may increase tool usage, decrease user interactions)
- ✅ No external dependencies or breaking changes
Overall Assessment
LGTM ✅
This is an excellent improvement that addresses a real UX issue with a targeted, well-thought-out solution. The implementation:
- Solves the core problem effectively
- Uses proper prompt engineering techniques
- Maintains all existing functionality
- Has comprehensive test coverage
- Follows good software engineering practices
Impact: This change should significantly improve user experience by making the AI more autonomous and reducing unnecessary interruptions.
Risk: Very low - this is a prompt-only change that adds constraints rather than removing functionality.
The minor style suggestions above are not blockers for approval. This is a solid contribution that will meaningfully improve the AI's behavior.
I am good with these changes.
This change adds explicit instructions to the ask_followup_question tool to ensure the AI first attempts to use available tools like search_files and list_files to locate information before asking the user questions. Fixes #3873 Signed-off-by: Eric Wheeler <[email protected]>
ee84a50 to 57f1090
@mrubens @hannesrudolph You gave this PR your seal of approval, can you take a look again?
@KJ7LNW Closing this PR/issue as we're currently unconvinced that this addresses a clear issue due to insufficient data or linked examples demonstrating the problem. I understand how this might potentially be problematic in theory, but I haven't personally encountered it. If you can provide specific cloud links showing the issue clearly, please reopen, and we'll gladly reconsider.
Context
The AI was sometimes asking users for information (like file paths) that it could have found itself using available tools like search_files or list_files. This created unnecessary back-and-forth with users and made the AI seem less capable than it actually is.
Implementation
This change adds explicit instructions to the ask_followup_question tool description to ensure the AI:
- First uses available tools (such as search_files and list_files) to locate the information it needs
- Only asks the user for details that the tools cannot retrieve
- Prioritizes its own knowledge, asking only when human consideration is genuinely needed
The changes were made to the ask-followup-question.ts file and the corresponding snapshot tests were updated.
Screenshots
N/A - This is a behavior change in the AI's prompting.
How to Test
Get in Touch
Discord: KJ7LNW
Fixes #3873
Important
Enhances AI behavior by ensuring it uses tools before asking users for information, with changes in ask-followup-question.ts and updated snapshot tests.
- getAskFollowupQuestionDescription() in ask-followup-question.ts: ensures the AI uses tools like search_files, list_files before asking users for information.
- system.test.ts.snap: updated to reflect the new behavior.
This description was created automatically for ee84a5020e5979c2b7bd32d22db7aa2526d4ae04. You can customize this summary. It will automatically update as commits are pushed.