
[BUG] Codebase Search Tool Not Being Called with Local Ollama Setup #8430

@dcroswell

Description


Problem (one or two sentences)

Issue:
The LLM never calls the codebase_search tool. Instead, it:

  • Guesses file locations based on directory structure
  • Hallucinates code content (generates fake FastAPI/Flask/Click code that doesn't exist)
  • Enters an infinite loop of asking follow-up questions
  • Treats codebase_search as a bash command when explicitly asked to use it

Additional Observations:
No "Roo: Index Codebase" or similar commands available in the command palette
Extension activates successfully, but doesn't appear in extension host activation logs

Context (who is affected and when)

Environment:

  • Roo Code version: 3.28.14
  • OS: Windows 11 with WSL2 (Ubuntu)
  • VSCode: Running in WSL remote
  • API Provider: Ollama (local)
  • Model: llama3.1:8b-instruct-q4_K_M
  • Embedder: Ollama all-minilm (384 dimensions)
  • Vector DB: Qdrant (local Docker container)
  • The code gets indexed into Qdrant and the index shows in the extension
  • Ollama endpoint: http://localhost:11434/
  • Qdrant endpoint: http://localhost:6333/
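
A quick way to sanity-check the embedder side of this setup (the model name and endpoint are taken from the environment above; the exact request shape follows Ollama's `/api/embeddings` REST API):

```shell
# Ask Ollama for an embedding directly; for all-minilm this should print 384,
# matching the dimension the extension indexed with.
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "all-minilm", "prompt": "find the main function"}' \
  | python3 -c 'import json,sys; print(len(json.load(sys.stdin)["embedding"]))'
```

These commands only work against the reporter's local services, so they are a diagnostic sketch rather than something runnable elsewhere.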

Verification Steps Completed:
✅ Qdrant has 4,561 indexed points in collection ws-dd9beb7f436a9121
✅ Ollama embedder works (generates correct 384-dim vectors)
✅ Direct Qdrant search returns correct results with proper file paths and code chunks
✅ WSL networking confirmed working (localhost resolves correctly)
✅ Extension shows as "Enabled on WSL: Ubuntu"
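
For reference, the "direct Qdrant search" verification above can be reproduced with something like the following (collection name from the verification list; the search request shape follows Qdrant's points-search API):

```shell
# Embed a query via Ollama, then search the workspace collection in Qdrant.
# Only works against the local services described in this report.
VEC=$(curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "all-minilm", "prompt": "main function"}' \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["embedding"])')
curl -s http://localhost:6333/collections/ws-dd9beb7f436a9121/points/search \
  -H 'Content-Type: application/json' \
  -d "{\"vector\": $VEC, \"limit\": 5, \"with_payload\": true}"
```

This returning proper file paths and code chunks is what confirms the indexing pipeline itself is healthy and the failure is on the tool-calling side.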

Reproduction steps

Configure Roo Code with local Ollama and Qdrant as described
Ask: "Find the main function in my codebase"
Observe LLM guessing instead of searching

Expected result

LLM should call the codebase_search tool and use the actual search results from Qdrant.

Actual result

LLM has no awareness of available tools. When asked "What tools do you have available?", it lists project files instead of its own tools. Extension host logs show no tool-related errors.

Variations tried (optional)

Tried different embedding models and spent a while reviewing any configs that might be interfering with the extension. Indexing through the extension works fine, but the LLM never seems to receive the results or know where to search, even though the Qdrant endpoint was configured and tested.

App Version

3.28.14

API Provider (optional)

Ollama

Model Used (optional)

llama3.1:8b-instruct-q4_K_M

Roo Code Task Links (optional)

No response

Relevant logs or errors (optional)

The answers would be guesses, and then Roo Code would go into a loop of asking for more information.
After hallucinating code, it would display an API Request message containing:
[ERROR] You did not use a tool in your previous response! Please retry with a tool use.
# Reminder: Instructions for Tool Use
Tool uses are formatted using XML-style tags. The tool name itself becomes the XML tag name. Each parameter is enclosed within its own set of tags. Here's the structure:
<actual_tool_name>
<parameter1_name>value1</parameter1_name>
<parameter2_name>value2</parameter2_name>
...
</actual_tool_name>
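
Per that reminder format, a correct invocation would presumably look something like the block below. Note the parameter names (`query`, `path`) are assumed from Roo Code's codebase_search tool description, not confirmed from logs, since the model never actually emitted one:

```xml
<codebase_search>
<query>main function entry point</query>
<path>src</path>
</codebase_search>
```

The model instead emitted plain prose (or a bash-style command), which is what triggers the "You did not use a tool" retry loop above.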

Metadata

Assignees

No one assigned

    Labels

    Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.), bug (Something isn't working)

    Type

    No type

    Projects

    Status

    Done

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
