Releases: jonigl/mcp-client-for-ollama

Release v0.18.1

22 Aug 11:06
e635364

What's Changed

Full Changelog: v0.18.0...v0.18.1

Release v0.18.0

14 Aug 14:14
4db4349

What's Changed

  • refactor: add constant for mcp protocol version by @jonigl in #67
  • feat: use compatible release (~=) for dependencies by @jonigl in #70
  • feat: add additional exit commands to client interface by @gnzng in #71
  • feat: checking SSE and Streamable HTTP mcp servers connectivity before starting sessions by @jonigl in #74
  • feat: adding num_ctx to set the size of the model context window by @jonigl in #75
  • chore: bump version to 0.18.0 by @jonigl in #76
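The connectivity pre-check added in #74 can be approximated with a few lines of standard-library Python. This is a hedged sketch, not the project's actual implementation: the `server_reachable` helper and its timeout default are illustrative. A plain TCP connect is enough to tell whether an SSE or Streamable HTTP endpoint is reachable before a session is started:

```python
import socket
from urllib.parse import urlparse

def server_reachable(url, timeout=2.0):
    """Best-effort TCP reachability check for an SSE/Streamable HTTP MCP server URL.

    Returns True if a TCP connection to the URL's host and port succeeds
    within `timeout` seconds, False on refusal, timeout, or DNS failure.
    """
    parsed = urlparse(url)
    host = parsed.hostname or "localhost"
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Calling this once per configured server URL before opening sessions lets the client fail fast with a clear message instead of hanging inside the MCP handshake.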

New Contributors

  • @gnzng made their first contribution in #71

Full Changelog: v0.17.0...v0.18.0

Release v0.17.0

06 Aug 11:38
4050078

What's Changed

  • refactor: replace hardcoded thinking models with dynamic capability detection by @jonigl in #65

Full Changelog: v0.16.0...v0.17.0

Release v0.16.0

26 Jul 12:43
bb9215c

What's Changed

Full Changelog: v0.15.1...v0.16.0

Release v0.15.1

19 Jul 00:01
5159293

What's Changed

Full Changelog: v0.15.0...v0.15.1

Release v0.15.0

18 Jul 17:47
94dfc03

What's Changed

  • docs: fix order for tools and models screenshots by @jonigl in #47
  • feat: upgrading libraries. Particularly mcp 1.12.0 by @jonigl in #48
  • feat: add --mcp-server-url option to support direct URL connections to SSE/HTTP MCP servers by @jonigl in #49
  • feat: adding new option on tool selection to view tool JSON for debugging purposes by @jonigl in #50
  • feat: add shorter flag options by @jonigl in #51
  • docs: updating main README.md using svg instead of gif for demo animation by @jonigl in #52
  • Bump version to 0.15.0 by @jonigl in #53

Full Changelog: v0.14.0...v0.15.0

Release v0.14.0

05 Jul 00:01
b41b807

What's Changed

  • ci: using generate_release_notes instead of creating the body by @jonigl in #38
  • feat: upgrade to MCP 1.10.1 with streamable HTTP transport by @jonigl in #41

Full Changelog: v0.13.0...v0.14.0

Release v0.13.0

28 Jun 18:28
9e670db

v0.13.0 – Typer CLI Migration, Enhanced Interactive UX, and CI Improvements

🚀 Highlights

This release delivers a modernized CLI powered by Typer, refined interactive model configuration, fuzzy autocomplete for better user input, and streamlined CI. It also updates the principal demo GIF in the README for improved clarity and presentation.

🔧 Changes

  • CLI Overhaul with Typer

    • Migrated from argparse to Typer for a more user-friendly and modern CLI experience.

    • Grouped CLI options for clearer help output:

      • MCP Server: --mcp-server, --servers-json, --auto-discovery
      • Ollama Model: --model, --host
    • Left general flags (e.g. --version) ungrouped for simplicity.

  • Interactive Configuration Improvements

    • Improved prompts and validation for each parameter.
    • Prompts now dynamically display current model, thinking mode status, and tool count.
    • Enhanced user feedback and guidance during setup.
  • Fuzzy Autocompletion & UX Enhancements

    • Added FZFStyleCompleter using WordCompleter + FuzzyCompleter for intuitive fuzzy matching.
    • An arrow indicator highlights the best autocomplete match.
    • Centralized prompt logic and completer styles in constants.py.
    • Enabled ignore_case matching for better experience.
  • Documentation & Visuals

    • Updated README.md to reflect CLI and UX changes.
    • Replaced main demo GIF with an updated version showcasing the new interface.
  • CI/CD Enhancements

    • Updated GitHub Actions:

      • action-gh-release from v1 to v2 with generate_release_notes enabled.
      • setup-python from v4 to v5 for better compatibility and performance.
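The FZFStyleCompleter combines prompt_toolkit's WordCompleter and FuzzyCompleter; conceptually, fuzzy matching accepts a candidate when the query appears in it as an in-order subsequence, regardless of gaps. A minimal stdlib sketch of that idea (the `fuzzy_match` helper is hypothetical, not the client's code):

```python
def fuzzy_match(query, candidate):
    """True if every character of `query` appears in `candidate` in order (case-insensitive)."""
    chars = iter(candidate.lower())
    # `ch in chars` advances the iterator, so matches must occur left to right.
    return all(ch in chars for ch in query.lower())

commands = ["model-config", "human-in-loop", "thinking-mode", "quit"]
matches = [c for c in commands if fuzzy_match("mc", c)]  # only "model-config" survives
```

This is why typing a couple of scattered characters is enough to narrow the command list, the same feel fzf gives on the shell.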

📦 Version bump to v0.13.0 to capture CLI, UX, and CI evolution.

👨‍💻 Demo

asciicast

Release v0.12.0

27 Jun 20:44
f941bc6

v0.12.0 – Advanced Model & System-Prompt Configuration

🚀 Highlights

This release turns model tuning into a first-class experience. You can now interactively explore, validate, and persist all 13 Ollama generation parameters—with smart defaults, one-keystroke shortcuts, and an undo stack to keep experimentation friction-free.

🔧 Key Changes

| Area | What's New |
| --- | --- |
| Full-stack parameter control | Every Ollama option (system prompt, sampling, penalties, etc.) surfaced via a new model-config (mc) CLI command. Only explicitly set values are sent, keeping your payload lean. |
| System prompt control | Define model behavior and persona independently from generation parameters, giving you more expressive control over tone, role, and intent. |
| Interactive help system | Hover or "?" on any parameter to pop a scrollable guide: ranges, examples, and practical tips for typical vs. edge-case tuning. |
| Quick ops & undo | u1-u13 (unset a single parameter) and usp (unset system prompt) shortcuts, plus an undo command to roll back the last change. |
| Validation & smart defaults | Built-in range checks and sensible starting values prevent invalid configs from ever leaving your terminal. |
| Persistence | Model configs are now saved alongside conversations, so your favorite settings survive restarts. |
| Docs & UX polish | README gains a ToC, extensive parameter docs, and example presets for common tasks (creative writing, code, concise QA, etc.). |

🧩 Supported Parameters
num_keep, seed, num_predict, temperature, top_k, top_p, min_p, typical_p, repeat_last_n, repeat_penalty, presence_penalty, frequency_penalty, stop.
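The "only explicitly set values are sent" behavior can be sketched as a small options builder. This is an illustrative assumption about the shape of the logic, not the project's actual code; `build_options` and `SUPPORTED` are hypothetical names:

```python
# The 13 generation parameters listed above.
SUPPORTED = {
    "num_keep", "seed", "num_predict", "temperature", "top_k", "top_p",
    "min_p", "typical_p", "repeat_last_n", "repeat_penalty",
    "presence_penalty", "frequency_penalty", "stop",
}

def build_options(**params):
    """Validate names against SUPPORTED and drop unset (None) values,
    so the request payload contains only what the user explicitly configured."""
    unknown = set(params) - SUPPORTED
    if unknown:
        raise ValueError(f"unsupported parameter(s): {sorted(unknown)}")
    return {k: v for k, v in params.items() if v is not None}

build_options(temperature=0.2, top_p=None, seed=42)
# -> {"temperature": 0.2, "seed": 42}
```

A dict built this way can be passed straight through as the `options` payload of an Ollama chat request, leaving every omitted parameter at the model's default.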

📚 Documentation Updates

  • Table of contents for faster navigation.
  • Step-by-step examples demonstrating the new mc flow and undo/quick-unset tricks.
  • Parameter reference guide with real-world use-case snippets.

🧑‍💻 Demo

asciicast


Enjoy finer-grained control, safer experimentation, and richer docs—all bundled into v0.12.0. 🎉


Packages

Installation

pip install mcp-client-for-ollama==0.12.0
# or
pip install ollmcp==0.12.0

If you already have it installed, use:

pip install --upgrade mcp-client-for-ollama
# or
pip install --upgrade ollmcp

Release v0.11.0

19 Jun 16:01
af2c0eb

v0.11.0 – Human-in-the-Loop (HIL) Confirmations for Safer Tool Execution

🚀 Highlights

This release introduces Human-in-the-Loop (HIL) confirmations for tool execution, providing users with greater control, safety, and transparency during automated operations. The system prompts for confirmation before executing tools, offering options to execute, skip, or disable confirmations per tool. This ensures users are always aware of potentially impactful actions.

🔧 Changes

  • Added HumanInTheLoopManager to manage execution confirmations
  • Implemented interactive HIL workflow with execute/skip/disable options
  • Introduced human-in-loop and hil CLI commands to control confirmation behavior
  • Persisted HIL settings across sessions via config save/load
  • Displayed current HIL status in the context info panel
  • Enhanced confirmation prompts with detailed tool information
  • Enabled HIL by default to prioritize safe operations
  • Updated help documentation and README to include HIL usage and commands
  • Updated mcp dependency from 1.9.3 to 1.9.4
  • Bumped version to v0.11.0

🛡️ Safety First

With HIL enabled by default, users are less likely to trigger unintended tool actions, making the system safer and more predictable.
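The execute/skip/disable workflow can be sketched as a small confirmation gate. This is a hedged approximation of what HumanInTheLoopManager does; `confirm_tool_call` and its prompt wording are illustrative assumptions, not the project's API:

```python
def confirm_tool_call(tool_name, disabled, ask=input):
    """Gate a tool call on user confirmation.

    Returns True to execute, False to skip. Choosing "disable" records the
    tool in `disabled` so future calls run without prompting.
    """
    if tool_name in disabled:
        return True  # confirmations turned off for this tool
    while True:
        answer = ask(
            f"Run tool '{tool_name}'? [e]xecute / [s]kip / [d]isable: "
        ).strip().lower()
        if answer in ("e", "execute"):
            return True
        if answer in ("s", "skip"):
            return False
        if answer in ("d", "disable"):
            disabled.add(tool_name)
            return True
```

Persisting the `disabled` set alongside the rest of the config is what lets per-tool choices survive restarts, as the release notes describe.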

🎥 Demo

Packages

Installation

pip install mcp-client-for-ollama==0.11.0
# or
pip install ollmcp==0.11.0

If you already have it installed, use:

pip install --upgrade mcp-client-for-ollama
# or
pip install --upgrade ollmcp