
Conversation


@Dw9 Dw9 commented Jan 16, 2026

If showModelInfo() fails for any single model, the entire Promise.all rejects, and the provider ends up returning an empty model array.

Failed to show Ollama model info for gpt-oss:20b: template: :3: function "currentDate" not defined
09:38:19.378 › Failed to get info for model gpt-oss:20b, using defaults: template: :3: function "currentDate" not defined


Now, when showModelInfo() errors, the model is still returned with default info instead of being dropped.
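The fix described above can be sketched as a per-model try/catch inside the Promise.all mapper. This is an illustrative sketch, not the actual ollamaProvider.ts code: fetchModelInfo and attachInfo are hypothetical stand-ins for showModelInfo and attachModelInfo, and the simulated data is invented; only the default values (context_length: 4096, embedding_length: 512, capabilities: ['chat']) and the warning log come from the PR.

```typescript
interface ModelInfo {
  name: string
  context_length: number
  embedding_length: number
  capabilities: string[]
}

// Hypothetical stand-in for showModelInfo(); the real call can throw,
// e.g. `template: :3: function "currentDate" not defined` for gpt-oss:20b.
async function fetchModelInfo(name: string): Promise<Omit<ModelInfo, 'name'>> {
  if (name === 'gpt-oss:20b') {
    throw new Error('template: :3: function "currentDate" not defined')
  }
  return { context_length: 8192, embedding_length: 1024, capabilities: ['chat', 'tools'] }
}

async function attachInfo(names: string[]): Promise<ModelInfo[]> {
  // Before the fix, a single rejection here made the whole Promise.all
  // reject and the caller fell back to an empty list. The try/catch keeps
  // every model in the result, substituting defaults on failure.
  return Promise.all(
    names.map(async (name) => {
      try {
        const info = await fetchModelInfo(name)
        return { name, ...info }
      } catch (err) {
        console.warn(`Failed to get info for model ${name}, using defaults:`, err)
        return { name, context_length: 4096, embedding_length: 512, capabilities: ['chat'] }
      }
    })
  )
}
```

With this shape, a list containing gpt-oss:20b still yields one entry per model; only the failing entry carries the defaults.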

Summary by CodeRabbit

Bug Fixes

  • Improved error handling for Ollama model information retrieval with automatic fallback defaults, ensuring uninterrupted service when model details are unavailable.



coderabbitai bot commented Jan 16, 2026

📝 Walkthrough

The Ollama provider's attachModelInfo function is enhanced with try/catch error handling. When showModelInfo succeeds, model information is extracted as before. On failure, the function returns a model with default values (context_length: 4096, embedding_length: 512, capabilities: ['chat']) and logs a warning.

Changes

  • Cohort: Error Handling Enhancement
  • File(s): src/main/presenter/llmProviderPresenter/providers/ollamaProvider.ts
  • Summary: Added a try/catch wrapper around attachModelInfo to gracefully handle showModelInfo failures with sensible defaults (context_length: 4096, embedding_length: 512, capabilities: ['chat']) and warning logging

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Suggested reviewers

  • zerob13

Poem

🐰 When Ollama stumbles and falters its way,
A rabbit's default values save the day,
With try and with catch, we're graceful and kind,
Four thousand contexts? That's peace of mind! 🌿✨

🚥 Pre-merge checks — ✅ 3 passed
  • Description Check — Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check — Passed: the title accurately describes the main change (handling showModelInfo errors to prevent empty model lists), which directly matches the PR's core objective of preventing a Promise.all rejection from resulting in an empty model array.
  • Docstring Coverage — Passed: no functions found in the changed files to evaluate docstring coverage; check skipped.




📜 Recent review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5421dce and c99fe05.

📒 Files selected for processing (1)
  • src/main/presenter/llmProviderPresenter/providers/ollamaProvider.ts
🧰 Additional context used
📓 Path-based instructions (10), all matching src/main/presenter/llmProviderPresenter/providers/ollamaProvider.ts:
  • **/*.{js,ts,tsx,jsx,vue,mjs,cjs} (.cursor/rules/development-setup.mdc): All logs and comments must be in English
  • **/*.{js,ts,tsx,jsx,mjs,cjs} (.cursor/rules/development-setup.mdc): Use OxLint as the linter
  • **/*.{js,ts,tsx,jsx,vue,json,mjs,cjs} (.cursor/rules/development-setup.mdc): Use Prettier as the code formatter
  • src/main/**/*.ts (AGENTS.md): Electron main process code should reside in src/main/, with presenters organized in the presenter/ subdirectory (Window, Tab, Thread, Mcp, Config, LLMProvider) and app events managed via eventbus.ts
  • **/*.{ts,tsx,vue} (AGENTS.md): Use camelCase for variable and function names, PascalCase for types and classes, and SCREAMING_SNAKE_CASE for constants; configure Prettier with single quotes, no semicolons, and a 100-character line width, and run pnpm run format after completing features
  • **/*.{ts,tsx} (AGENTS.md): Use OxLint for linting JavaScript and TypeScript files; ensure lint-staged hooks and typecheck pass before commits; enable strict TypeScript type checking
  • **/*.{ts,tsx,js,jsx,vue} (CLAUDE.md): Use English for all logs and comments
  • src/main/presenter/**/*.ts (CLAUDE.md): Implement all system capabilities as main-process presenters following the Presenter Pattern
  • src/main/presenter/llmProviderPresenter/providers/*.ts (CLAUDE.md): Implement the coreStream method following the standardized event interface for LLM providers
  • {src/main,src/renderer,test}/**/*.{ts,tsx,js} (CLAUDE.md): Use IPC communication: the renderer calls the main process via the usePresenter composable, and the main process sends events to the renderer via the EventBus
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)
🔇 Additional comments (1)
src/main/presenter/llmProviderPresenter/providers/ollamaProvider.ts (1)

325-357: Good fix for preventing Promise.all rejection on individual model failures.

The try/catch pattern correctly handles showModelInfo errors, ensuring that a single failing model doesn't cause the entire model list to be empty.

Minor suggestion: Line 329 accesses model.details.family without optional chaining, while line 73 uses model.details?.family. For consistency and to handle edge cases where details might be undefined, consider using optional chaining here as well.

🔧 Optional improvement for consistency
  try {
    const showResponse = await this.showModelInfo(model.name)
    const info = showResponse.model_info
-   const family = model.details.family
+   const family = model.details?.family
    const context_length = info?.[family + '.context_length'] ?? 4096
    const embedding_length = info?.[family + '.embedding_length'] ?? 512
    const capabilities = showResponse.capabilities ?? ['chat']
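The reviewer's optional-chaining point can be seen in a standalone sketch. The Details/Model shapes and the contextLength helper below are assumptions for illustration, not the actual Ollama types, and the key is built with a template literal (rather than the `+` from the diff) so the sketch type-checks under strict mode:

```typescript
// Assumed shapes; the real Ollama list response may differ.
interface Details {
  family?: string
}
interface Model {
  name: string
  details?: Details
}

function contextLength(model: Model, info?: Record<string, number>): number {
  // `model.details.family` would throw a TypeError if `details` is missing.
  // With optional chaining, `family` is simply undefined, the lookup key
  // becomes 'undefined.context_length', the lookup misses, and `??`
  // supplies the 4096 default from the fix.
  const family = model.details?.family
  return info?.[`${family}.context_length`] ?? 4096
}
```

So without optional chaining a malformed list entry is rescued only by the outer catch (losing any usable showResponse data), whereas with it the per-field `??` fallbacks still apply.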




@zerob13 zerob13 merged commit bbb7166 into ThinkInAIXYZ:dev Jan 16, 2026
2 checks passed
@Dw9 Dw9 deleted the fix-ollama-showmodel branch January 16, 2026 02:12
