diff --git a/.github/prompts/freshness-tier1.prompt.md b/.github/prompts/freshness-tier1.prompt.md index a26490a177dd..41a784ccb232 100644 --- a/.github/prompts/freshness-tier1.prompt.md +++ b/.github/prompts/freshness-tier1.prompt.md @@ -12,5 +12,6 @@ how fresh and up to date it is. Apply the following: 5. Ensure all the components formatted correctly. 6. Improve the SEO keywords. 7. If you find numbered lists, make sure their numbering only uses 1's. +8. Ensure each line is limited to 80 characters. Do your best and don't be lazy. \ No newline at end of file diff --git a/.github/prompts/freshness-tier2.prompt.md b/.github/prompts/freshness-tier2.prompt.md index e2935f12e4af..b106fb9e00eb 100644 --- a/.github/prompts/freshness-tier2.prompt.md +++ b/.github/prompts/freshness-tier2.prompt.md @@ -18,5 +18,6 @@ how fresh and up to date it is. Apply the following: 5. Try to add some helpful next steps to the end of the document, but only if there are no *Next steps* or *Related pages* section, already. 6. Try to clarify, shorten or improve the efficiency of some sentences. 7. Check for LLM readibility. +8. Ensure each line is limited to 80 characters. Do your best and don't be lazy. \ No newline at end of file diff --git a/content/includes/gordondhi.md b/content/includes/gordondhi.md index 909be24cd51d..af7dac8bbe41 100644 --- a/content/includes/gordondhi.md +++ b/content/includes/gordondhi.md @@ -1,5 +1,5 @@ 1. Ensure Gordon is [enabled](/manuals/ai/gordon.md#enable-ask-gordon). -1. In Gordon's Toolbox, ensure Gordon's [Developer MCP toolkit is enabled](/manuals/ai/gordon/mcp/built-in-tools.md#configuration). +1. In Gordon's Toolbox, ensure Gordon's [Developer MCP Toolkit is enabled](/manuals/ai/gordon/mcp/built-in-tools.md#configuration). 1. In the terminal, navigate to the directory containing your Dockerfile. 1. 
Start a conversation with Gordon: ```bash diff --git a/content/manuals/ai/gordon/_index.md b/content/manuals/ai/gordon/_index.md index 9b0f07a07b13..7cb8e931e850 100644 --- a/content/manuals/ai/gordon/_index.md +++ b/content/manuals/ai/gordon/_index.md @@ -1,6 +1,6 @@ --- title: Ask Gordon -description: Learn how to streamline your workflow with Docker's AI-powered assistant. +description: Streamline your workflow with Docker's AI-powered assistant in Docker Desktop and CLI. weight: 10 params: sidebar: @@ -8,7 +8,7 @@ params: color: blue text: Beta group: AI -aliases: +aliases: - /desktop/features/gordon/ --- @@ -37,7 +37,7 @@ Ask Gordon is not enabled by default, and is not production-ready. You may also encounter the term "Docker AI" as a broader reference to this technology. -> [!NOTE] +> [!NOTE] > > Ask Gordon is powered by Large Language Models (LLMs). Like all > LLM-based tools, its responses may sometimes be inaccurate. Always verify the @@ -45,8 +45,7 @@ reference to this technology. ### What data does Gordon access? -When you use Ask Gordon, the data it accesses depends on the context of your -query: +When you use Ask Gordon, the data it accesses depends on your query: - Local files: If you use the `docker ai` command, Ask Gordon can access files and directories in the current working directory where the command is @@ -57,19 +56,18 @@ query: registry. To provide accurate responses, Ask Gordon may send relevant files, directories, -or image metadata to the Gordon backend along with your query. This data -transfer occurs over the network but is never stored persistently or shared with -third parties. It is used exclusively to process your request and formulate a -response. For more information about privacy terms and conditions for Docker AI, -review [Gordon's Supplemental -Terms](https://www.docker.com/legal/docker-ai-supplemental-terms/). +or image metadata to the Gordon backend with your query. 
This data transfer +occurs over the network but is never stored persistently or shared with third +parties. It is used only to process your request and formulate a response. For +details about privacy terms and conditions for Docker AI, review [Gordon's +Supplemental Terms](https://www.docker.com/legal/docker-ai-supplemental-terms/). All data transferred is encrypted in transit. ### How your data is collected and used Docker collects anonymized data from your interactions with Ask Gordon to -enhance the service. This includes the following: +improve the service. This includes: - Your queries: Questions you ask Gordon. - Responses: Answers provided by Gordon. @@ -81,8 +79,8 @@ To ensure privacy and security: - Docker does not use this data to train AI models or share it with third parties. -By using Ask Gordon, you help improve Docker AI's reliability and accuracy, -making it more effective for all users. +By using Ask Gordon, you help improve Docker AI's reliability and accuracy for +everyone. If you have concerns about data collection or usage, you can [disable](#disable-ask-gordon) the feature at any time. @@ -90,36 +88,35 @@ If you have concerns about data collection or usage, you can ## Enable Ask Gordon 1. Sign in to your Docker account. -2. Navigate to the **Beta features** tab in settings. -3. Check the **Enable Docker AI** checkbox. +1. Go to the **Beta features** tab in settings. +1. Check the **Enable Docker AI** checkbox. - The Docker AI terms of service agreement is displayed. You must agree to the - terms before you can enable the feature. Review the terms and select **Accept - and enable** to continue. + The Docker AI terms of service agreement appears. You must agree to the terms + before you can enable the feature. Review the terms and select **Accept and + enable** to continue. -4. Select **Apply**. +1. Select **Apply**. 
> [!IMPORTANT] > -> For Docker Desktop versions 4.41 and earlier, this settings lived under the **Experimental features** tab on the **Features in development** page. +> For Docker Desktop versions 4.41 and earlier, this setting is under the +> **Experimental features** tab on the **Features in development** page. ## Using Ask Gordon You can access Gordon: - In Docker Desktop, in the **Ask Gordon** view. -- Via the Docker CLI, with the `docker ai` CLI command. +- In the Docker CLI, with the `docker ai` command. -Once you've enabled the Docker AI features, you'll also find references to **Ask -Gordon** in various other places throughout the Docker Desktop user interface. -Whenever you encounter a button with the **Sparkles** (✨) icon in the user -interface, you can use the button to get contextual support from Ask Gordon. +After you enable Docker AI features, you will also see **Ask Gordon** in other +places in Docker Desktop. Whenever you see a button with the **Sparkles** (✨) +icon, you can use it to get contextual support from Ask Gordon. ## Example workflows -Ask Gordon is a general-purpose AI assistant created to help you with all your -Docker-related tasks and workflows. If you need some inspiration, here are a few -ways things you can try: +Ask Gordon is a general-purpose AI assistant for Docker tasks and workflows. Here +are some things you can try: - [Troubleshoot a crashed container](#troubleshoot-a-crashed-container) - [Get help with running a container](#get-help-with-running-a-container) @@ -134,9 +131,9 @@ $ docker ai "What can you do?" ### Troubleshoot a crashed container -If you try to start a container with an invalid configuration or command, you -can use Ask Gordon to troubleshoot the error. For example, try starting a -Postgres container without specifying a database password: +If you start a container with an invalid configuration or command, use Ask Gordon +to troubleshoot the error. 
For example, try starting a Postgres container without
+a database password:

```console
$ docker run postgres
@@ -156,17 +153,16 @@ container's name, or inspect the container and open the **Ask Gordon** tab.

### Get help with running a container

-If you want to run a specific image but you're not sure how, Gordon might be
-able to help you get set up:
+If you want to run a specific image but are not sure how, Gordon can help you get
+set up:

1. Pull an image from Docker Hub (for example, `postgres`).
-2. Open the **Images** view in Docker Desktop and select the image.
-3. Select the **Run** button.
+1. Open the **Images** view in Docker Desktop and select the image.
+1. Select the **Run** button.

-In the **Run a new container** dialog, you should see a message about
-**Ask Gordon**.
+In the **Run a new container** dialog, you see a message about **Ask Gordon**.

-![Ask Gordon hint in Docker Desktop](../../images/gordon-run-ctr.png)
+![Screenshot showing Ask Gordon hint in Docker Desktop.](../../images/gordon-run-ctr.png)

The linked text in the hint is a suggested prompt to start a conversation with
Ask Gordon.

@@ -176,13 +172,13 @@ Ask Gordon.

Gordon can analyze your Dockerfile and suggest improvements. To have Gordon
evaluate your Dockerfile using the `docker ai` command:

-1. Navigate to your project directory:
+1. Go to your project directory:

   ```console
-   $ cd path/to/my/project
+   $ cd path/to/your/project
   ```

-2. Use the `docker ai` command to rate your Dockerfile:
+1. Use the `docker ai` command to rate your Dockerfile:

   ```console
   $ docker ai rate my Dockerfile
@@ -202,8 +198,8 @@ across several dimensions:

### Migrate a Dockerfile to DHI

-Migrating your Dockerfile to use [Docker Hardened Images](/manuals/dhi/_index.md) helps you build
-more secure, minimal, and production-ready containers. DHIs are designed to
+Migrating your Dockerfile to use [Docker Hardened Images](/manuals/dhi/_index.md)
+helps you build more secure, minimal, and production-ready containers.
DHIs reduce vulnerabilities, enforce best practices, and simplify compliance, making them a strong foundation for secure software supply chains. @@ -218,16 +214,15 @@ To request Gordon's help for the migration: If you've enabled Ask Gordon and you want to disable it again: 1. Open the **Settings** view in Docker Desktop. -2. Navigate to **Beta features**. -3. Clear the **Enable Docker AI** checkbox. -4. Select **Apply**. +1. Go to **Beta features**. +1. Clear the **Enable Docker AI** checkbox. +1. Select **Apply**. ### For organizations -If you want to disable Ask Gordon for your entire Docker organization, using -[Settings -Management](/manuals/enterprise/security/hardened-desktop/settings-management/_index.md), -add the following property to your `admin-settings.json` file: +To disable Ask Gordon for your entire Docker organization, use [Settings +Management](/manuals/enterprise/security/hardened-desktop/settings-management/_index.md) +and add this property to your `admin-settings.json` file: ```json { @@ -238,8 +233,7 @@ add the following property to your `admin-settings.json` file: } ``` -Alternatively, you can disable all Beta features by setting `allowBetaFeatures` -to false: +Or disable all Beta features by setting `allowBetaFeatures` to false: ```json { diff --git a/content/manuals/ai/gordon/mcp/_index.md b/content/manuals/ai/gordon/mcp/_index.md index ebbf14f51c62..70e10255208f 100644 --- a/content/manuals/ai/gordon/mcp/_index.md +++ b/content/manuals/ai/gordon/mcp/_index.md @@ -1,7 +1,7 @@ --- -title: MCP -description: Learn how to use MCP servers with Gordon -keywords: ai, mcp, gordon, docker desktop, docker, llm, +title: Model Context Protocol (MCP) +description: Learn how to use Model Context Protocol (MCP) servers with Gordon to extend AI capabilities in Docker Desktop. +keywords: ai, mcp, gordon, docker desktop, docker, llm, model context protocol grid: - title: Built-in tools description: Use the built-in tools. 
@@ -15,16 +15,14 @@ aliases: - /desktop/features/gordon/mcp/ --- -## What is MCP? - [Model Context Protocol](https://modelcontextprotocol.io/introduction) (MCP) is -an open protocol that standardizes how applications provide context and extra -functionality to large language models. MCP functions as a client-server -protocol, where the client, for example an application like Gordon, sends -requests, and the server processes those requests to deliver the necessary -context to the AI. This context may be gathered by the MCP server by executing -some code to perform an action and getting the result of the action, calling -external APIs, etc. +an open protocol that standardizes how applications provide context and +additional functionality to large language models. MCP functions as a +client-server protocol, where the client, for example an application like +Gordon, sends requests, and the server processes those requests to deliver the +necessary context to the AI. This context may be gathered by the MCP server by +executing code to perform an action and retrieving the result, calling external +APIs, or other similar operations. Gordon, along with other MCP clients like Claude Desktop or Cursor, can interact with MCP servers running as containers. 
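+
+As a concrete illustration, a tool invocation in MCP is a JSON-RPC request from
+the client (for example, Gordon) to the server. The following sketch is
+illustrative only; the tool name and arguments here are hypothetical, not taken
+from Gordon's actual traffic:
+
+```json
+{
+  "jsonrpc": "2.0",
+  "id": 1,
+  "method": "tools/call",
+  "params": {
+    "name": "get_current_time",
+    "arguments": { "timezone": "Pacific/Tarawa" }
+  }
+}
+```
+
+The server runs the tool and returns a JSON-RPC response whose `result` carries
+the gathered context (here, the current time) back for the model to use.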
diff --git a/content/manuals/ai/gordon/mcp/built-in-tools.md b/content/manuals/ai/gordon/mcp/built-in-tools.md index 9d253f0c445f..1e7287479bbb 100644 --- a/content/manuals/ai/gordon/mcp/built-in-tools.md +++ b/content/manuals/ai/gordon/mcp/built-in-tools.md @@ -1,40 +1,39 @@ --- -title: Built-in tools -description: How to use Gordon's built-in tools -keywords: ai, mcp, gordon +title: Built-in tools in Gordon +description: Use and configure Gordon's built-in tools for Docker, Kubernetes, security, and development workflows +keywords: ai, mcp, gordon, docker, kubernetes, security, developer tools, toolbox, configuration, usage aliases: - /desktop/features/gordon/mcp/built-in-tools/ --- -Gordon comes with an integrated toolbox providing access to various system tools -and capabilities. These tools extend Gordon's functionality by allowing it to -interact with the Docker Engine, Kubernetes, Docker Scout's security scanning, -and other developer utilities. This documentation covers the available tools, -their configuration, and usage patterns. +Gordon includes an integrated toolbox that gives you access to system tools and +capabilities. These tools extend Gordon's functionality so you can interact with +the Docker Engine, Kubernetes, Docker Scout security scanning, and other +developer utilities. This article describes the available tools, how to +configure them, and usage patterns. -## Configuration +## Configure tools -Tools can be configured globally in the toolbox, making them accessible -throughout the Gordon interfaces, including both Docker Desktop and the CLI. +Configure tools globally in the toolbox to make them available throughout +Gordon, including Docker Desktop and the CLI. -To configure: +To configure tools: -1. On the **Ask Gordon** view in Docker Desktop, select the `Toolbox` button in the bottom left of the input area. +1. In the **Ask Gordon** view in Docker Desktop, select the **Toolbox** button at the bottom left of the input area. 
- ![Gordon page with the toolbox button](../images/gordon.png) + ![Screenshot showing Gordon page with the toolbox button.](../images/gordon.png) -2. To enable or disable a tool, select it in the left-menu and select the toggle. +1. To enable or disable a tool, select it in the left menu and select the toggle. - ![Gordon's Toolbox](../images/toolbox.png) + ![Screenshot showing Gordon's Toolbox.](../images/toolbox.png) - For more information on the available Docker tools, see [Reference](#reference). + For more information about Docker tools, see [Reference](#reference). ## Usage examples -This section provides task-oriented examples for common operations with Gordon -tools. +This section shows common tasks you can perform with Gordon tools. -### Managing Docker containers +### Manage Docker containers #### List and monitor containers @@ -62,7 +61,7 @@ $ docker ai "Stop my database container" $ docker ai "Remove all stopped containers" ``` -### Working with Docker images +### Work with Docker images ```console # List available images @@ -78,7 +77,7 @@ $ docker ai "Build an image from my current directory and tag it as myapp:latest $ docker ai "Remove all my unused images" ``` -### Managing Docker volumes +### Manage Docker volumes ```console # List volumes @@ -87,11 +86,11 @@ $ docker ai "List all my Docker volumes" # Create a new volume $ docker ai "Create a new volume called postgres-data" -# Backup data from a container to a volume +# Back up data from a container to a volume $ docker ai "Create a backup of my postgres container data to a new volume" ``` -### Kubernetes operations +### Perform Kubernetes operations ```console # Create a deployment @@ -104,8 +103,7 @@ $ docker ai "Show me all deployments in the default namespace" $ docker ai "Show me logs from the auth-service pod" ``` -### Security analysis - +### Run security analysis ```console # Scan for CVEs @@ -115,7 +113,7 @@ $ docker ai "Scan my application for security vulnerabilities" $ docker ai "Give 
me recommendations for improving the security of my nodejs-app image" ``` -### Development workflows +### Use development workflows ```console # Analyze and commit changes @@ -127,107 +125,116 @@ $ docker ai "Show me the status of my current branch compared to main" ## Reference -This section provides a comprehensive listing of the built-in tools you can find -in Gordon's toolbox. +This section lists the built-in tools in Gordon's toolbox. ### Docker tools -Tools to interact with your Docker containers, images, and volumes. +Interact with Docker containers, images, and volumes. #### Container management | Name | Description | |---------------|----------------------------------------| -| `docker` | Access to the Docker CLI | -| `list_builds` | List the builds in the Docker daemon | -| `build_logs` | Show the build logs. | +| `docker` | Access the Docker CLI | +| `list_builds` | List builds in the Docker daemon | +| `build_logs` | Show build logs | #### Volume management -| Tool | Description | -|------|-------------| -| `list_volumes` | List all Docker volumes | -| `remove_volume` | Remove a Docker volume | -| `create_volume` | Create a new Docker volume | +| Tool | Description | +|----------------|---------------------------| +| `list_volumes` | List all Docker volumes | +| `remove_volume`| Remove a Docker volume | +| `create_volume`| Create a new Docker volume| #### Image management -| Tool | Description | -|------|-------------| -| `list_images` | List all Docker images | -| `remove_images` | Remove Docker images | -| `pull_image` | Pull an image from a registry | -| `push_image` | Push an image to a registry | -| `build_image` | Build a Docker image | -| `tag_image` | Tag a Docker image | -| `inspect` | Inspect a Docker object | +| Tool | Description | +|----------------|-------------------------------| +| `list_images` | List all Docker images | +| `remove_images`| Remove Docker images | +| `pull_image` | Pull an image from a registry | +| `push_image` | Push 
an image to a registry | +| `build_image` | Build a Docker image | +| `tag_image` | Tag a Docker image | +| `inspect` | Inspect a Docker object | ### Kubernetes tools -Tools to interact with your Kubernetes cluster +Interact with your Kubernetes cluster. -#### Pods +#### Pod management -| Tool | Description | -|------|-------------| -| `list_pods` | List all pods in the cluster | -| `get_pod_logs` | Get logs from a specific pod | +| Tool | Description | +|----------------|------------------------------------| +| `list_pods` | List all pods in the cluster | +| `get_pod_logs` | Get logs from a specific pod | #### Deployment management - -| Tool | Description | -|------|-------------| -| `list_deployments` | List all deployments | -| `create_deployment` | Create a new deployment | -| `expose_deployment` | Expose a deployment as a service | -| `remove_deployment` | Remove a deployment | +| Tool | Description | +|--------------------|------------------------------------| +| `list_deployments` | List all deployments | +| `create_deployment`| Create a new deployment | +| `expose_deployment`| Expose a deployment as a service | +| `remove_deployment`| Remove a deployment | #### Service management -| Tool | Description | -|------|-------------| -| `list_services` | List all services | -| `remove_service` | Remove a service | +| Tool | Description | +|----------------|---------------------------| +| `list_services`| List all services | +| `remove_service`| Remove a service | #### Cluster information -| Tool | Description | -|------|-------------| -| `list_namespaces` | List all namespaces | -| `list_nodes` | List all nodes in the cluster | +| Tool | Description | +|------------------|-----------------------------| +| `list_namespaces`| List all namespaces | +| `list_nodes` | List all nodes in the cluster| ### Docker Scout tools -Security analysis tools powered by Docker Scout. +Security analysis powered by Docker Scout. 
-| Tool | Description |
-|------|-------------|
-| `search_for_cves` | Analyze a Docker image, a project directory, or other artifacts for vulnerabilities using Docker Scout CVEs.search for cves |
-| `get_security_recommendations` | Analyze a Docker image, a project directory, or other artifacts for base image update recommendations using Docker Scout. |
+| Tool | Description |
+|--------------------------------|-------------------------------------------------------------------------------------------------------------------------|
+| `search_for_cves` | Analyze a Docker image, project directory, or other artifacts for vulnerabilities using Docker Scout CVEs. |
+| `get_security_recommendations` | Analyze a Docker image, project directory, or other artifacts for base image update recommendations using Docker Scout. |

### Developer tools

General-purpose development utilities.

-| Tool | Description |
-|------|-------------|
-| `fetch` | Retrieve content from a URL |
-| `get_command_help` | Get help for CLI commands |
-| `run_command` | Execute shell commands |
-| `filesystem` | Perform filesystem operations |
-| `git` | Execute git commands |
+| Tool | Description |
+|-------------------|----------------------------------|
+| `fetch` | Retrieve content from a URL |
+| `get_command_help`| Get help for CLI commands |
+| `run_command` | Execute shell commands |
+| `filesystem` | Perform filesystem operations |
+| `git` | Execute git commands |

### AI model tools

-| Tool | Description |
-|------|-------------|
-| `list_models` | List all available Docker models |
-| `pull_model` | Download an Docker model |
-| `run_model` | Query a model with a prompt |
-| `remove_model` | Remove an Docker model |
+| Tool | Description |
+|----------------|------------------------------------|
+| `list_models` | List all available Docker models |
+| `pull_model` | Download a Docker model |
+| `run_model` | Query a model with a prompt |
+| `remove_model` | Remove a Docker model |

### Docker MCP Catalog

-If you have enabled the [MCP Toolkit feature](../../mcp-catalog-and-toolkit/_index.md),
+If you have enabled the [MCP Toolkit feature](../../mcp-catalog-and-toolkit/_index.md),
all the tools you have enabled and configured are available for Gordon to use.
diff --git a/content/manuals/ai/gordon/mcp/yaml.md b/content/manuals/ai/gordon/mcp/yaml.md
index 806e2aaa7f65..cf17307acd42 100644
--- a/content/manuals/ai/gordon/mcp/yaml.md
+++ b/content/manuals/ai/gordon/mcp/yaml.md
@@ -1,29 +1,29 @@
---
-title: YAML configuration
-description: Learn how to use MCP servers with Gordon
-keywords: ai, mcp, gordon
-aliases:
+title: Configure MCP servers with YAML
+description: Use MCP servers with Gordon
+keywords: ai, mcp, gordon, yaml, configuration, docker compose, mcp servers, extensibility
+aliases:
- /desktop/features/gordon/mcp/yaml/
---

-Docker has partnered with Anthropic to build container images for the [reference
-implementations](https://github.com/modelcontextprotocol/servers/) of MCP
-servers available on Docker Hub under [the mcp
-namespace](https://hub.docker.com/u/mcp).
+Docker works with Anthropic to provide container images for the
+[reference implementations](https://github.com/modelcontextprotocol/servers/)
+of MCP servers. These are available on Docker Hub under
+[the mcp namespace](https://hub.docker.com/u/mcp).
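+
+Because these reference servers are published as ordinary container images, you
+can pull one ahead of time like any other image. This example assumes the
+`mcp/time` image from that namespace:
+
+```console
+$ docker pull mcp/time
+```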
-When you run the `docker ai` command in your terminal to ask a question, Gordon -looks in the `gordon-mcp.yml` file in your working directory (if present) for a -list of MCP servers that should be used when in that context. The -`gordon-mcp.yml` file is a Docker Compose file that configures MCP servers as -Compose services for Gordon to access. +When you run the `docker ai` command in your terminal, Gordon checks for a +`gordon-mcp.yml` file in your working directory. If present, this file lists +the MCP servers Gordon should use in that context. The `gordon-mcp.yml` file +is a Docker Compose file that configures MCP servers as Compose services for +Gordon to access. -The following minimal example shows how you can use the [mcp-time -server](https://hub.docker.com/r/mcp/time) to provide temporal capabilities to -Gordon. For more information, you can check out the [source code and -documentation](https://github.com/modelcontextprotocol/servers/tree/main/src/time). +The following minimal example shows how to use the +[mcp-time server](https://hub.docker.com/r/mcp/time) to provide temporal +capabilities to Gordon. For more details, see the +[source code and documentation](https://github.com/modelcontextprotocol/servers/tree/main/src/time). -Create the `gordon-mcp.yml` file in your working directory and add the time - server: +Create a `gordon-mcp.yml` file in your working directory and add the time +server: ```yaml services: @@ -31,26 +31,24 @@ services: image: mcp/time ``` -With this file present, you can now ask Gordon to tell you the time in - another timezone: +With this file present, you can now ask Gordon to tell you the time in another +timezone: - ```bash - $ docker ai 'what time is it now in kiribati?' - - • Calling get_current_time - - The current time in Kiribati (Tarawa) is 9:38 PM on January 7, 2025. - - ``` +```bash +$ docker ai 'what time is it now in kiribati?' 
+ + • Calling get_current_time + + The current time in Kiribati (Tarawa) is 9:38 PM on January 7, 2025. +``` -As you can see, Gordon found the MCP time server and called its tool when -needed. +Gordon finds the MCP time server and calls its tool when needed. -## Advanced usage +## Use advanced MCP server features Some MCP servers need access to your filesystem or system environment variables. -Docker Compose can help with this. Since `gordon-mcp.yml` is a Compose file you -can add bind mounts using the regular Docker Compose syntax, which makes your +Docker Compose helps with this. Because `gordon-mcp.yml` is a Compose file, you +can add bind mounts using standard Docker Compose syntax. This makes your filesystem resources available to the container: ```yaml @@ -63,12 +61,12 @@ services: - .:/rootfs ``` -The `gordon-mcp.yml` file adds filesystem access capabilities to Gordon and -since everything runs inside a container Gordon only has access to the -directories you specify. +The `gordon-mcp.yml` file adds filesystem access capabilities to Gordon. Because +everything runs inside a container, Gordon only has access to the directories +you specify. -Gordon can handle any number of MCP servers. For example, if you give Gordon -access to the internet with the `mcp/fetch` server: +Gordon can use any number of MCP servers. For example, to give Gordon internet +access with the `mcp/fetch` server: ```yaml services: @@ -82,53 +80,66 @@ services: - .:/rootfs ``` -You can now ask things like: +You can now ask Gordon to fetch content and write it to a file: ```bash -$ docker ai can you fetch rumpl.dev and write the summary to a file test.txt +$ docker ai can you fetch rumpl.dev and write the summary to a file test.txt • Calling fetch ✔️ • Calling write_file ✔️ - - The summary of the website rumpl.dev has been successfully written to the file test.txt in the allowed directory. Let me know if you need further assistance! 
+ The summary of the website rumpl.dev has been successfully written to the + file test.txt in the allowed directory. Let me know if you need further + assistance! -$ cat test.txt -The website rumpl.dev features a variety of blog posts and articles authored by the site owner. Here's a summary of the content: +$ cat test.txt +The website rumpl.dev features a variety of blog posts and articles authored +by the site owner. Here's a summary of the content: -1. **Wasmio 2023 (March 25, 2023)**: A recap of the WasmIO 2023 conference held in Barcelona. The author shares their experience as a speaker and praises the organizers for a successful event. +1. **Wasmio 2023 (March 25, 2023)**: A recap of the WasmIO 2023 conference + held in Barcelona. The author shares their experience as a speaker and + praises the organizers for a successful event. -2. **Writing a Window Manager in Rust - Part 2 (January 3, 2023)**: The second part of a series on creating a window manager in Rust. This installment focuses on enhancing the functionality to manage windows effectively. +2. **Writing a Window Manager in Rust - Part 2 (January 3, 2023)**: The + second part of a series on creating a window manager in Rust. This + installment focuses on enhancing the functionality to manage windows + effectively. -3. **2022 in Review (December 29, 2022)**: A personal and professional recap of the year 2022. The author reflects on the highs and lows of the year, emphasizing professional achievements. +3. **2022 in Review (December 29, 2022)**: A personal and professional recap + of the year 2022. The author reflects on the highs and lows of the year, + emphasizing professional achievements. -4. **Writing a Window Manager in Rust - Part 1 (December 28, 2022)**: The first part of the series on building a window manager in Rust. The author discusses setting up a Linux machine and the challenges of working with X11 and Rust. +4. 
**Writing a Window Manager in Rust - Part 1 (December 28, 2022)**: The + first part of the series on building a window manager in Rust. The author + discusses setting up a Linux machine and the challenges of working with + X11 and Rust. -5. **Add docker/docker to your dependencies (May 10, 2020)**: A guide for Go developers on how to use the Docker client library in their projects. The post includes a code snippet demonstrating the integration. +5. **Add docker/docker to your dependencies (May 10, 2020)**: A guide for Go + developers on how to use the Docker client library in their projects. The + post includes a code snippet demonstrating the integration. -6. **First (October 11, 2019)**: The inaugural post on the blog, featuring a simple "Hello World" program in Go. +6. **First (October 11, 2019)**: The inaugural post on the blog, featuring a + simple "Hello World" program in Go. ``` ## What’s next? -Now that you’ve learned how to use MCP servers with Gordon, here are a few ways -you can get started: +Now that you know how to use MCP servers with Gordon, try these next steps: - Experiment: Try integrating one or more of the tested MCP servers into your `gordon-mcp.yml` file and explore their capabilities. -- Explore the ecosystem: Check out the [reference implementations on - GitHub](https://github.com/modelcontextprotocol/servers/) or browse the - [Docker Hub MCP namespace](https://hub.docker.com/u/mcp) for additional - servers that might suit your needs. -- Build your own: If none of the existing servers meet your needs, or you’re - curious about exploring how they work in more detail, consider developing a - custom MCP server. Use the [MCP - specification](https://www.anthropic.com/news/model-context-protocol) as a - guide. -- Share your feedback: If you discover new servers that work well with Gordon - or encounter issues with existing ones, [share your findings to help improve - the ecosystem](https://docker.qualtrics.com/jfe/form/SV_9tT3kdgXfAa6cWa). 
- -With MCP support, Gordon offers powerful extensibility and flexibility to meet -your specific use cases whether you’re adding temporal awareness, file -management, or internet access. +- Explore the ecosystem. See the [reference implementations on + GitHub](https://github.com/modelcontextprotocol/servers/) or browse the + [Docker Hub MCP namespace](https://hub.docker.com/u/mcp) for more servers + that might suit your needs. +- Build your own. If none of the existing servers meet your needs, or you want + to learn more, develop a custom MCP server. Use the + [MCP specification](https://www.anthropic.com/news/model-context-protocol) + as a guide. +- Share your feedback. If you discover new servers that work well with Gordon + or encounter issues, [share your findings to help improve the + ecosystem](https://docker.qualtrics.com/jfe/form/SV_9tT3kdgXfAa6cWa). + +With MCP support, Gordon gives you powerful extensibility and flexibility for +your use cases, whether you need temporal awareness, file management, or +internet access. diff --git a/content/manuals/ai/mcp-catalog-and-toolkit/_index.md b/content/manuals/ai/mcp-catalog-and-toolkit/_index.md index c91713b27fc5..dfc27b92f7a2 100644 --- a/content/manuals/ai/mcp-catalog-and-toolkit/_index.md +++ b/content/manuals/ai/mcp-catalog-and-toolkit/_index.md @@ -16,19 +16,26 @@ grid: icon: hub link: /ai/mcp-catalog-and-toolkit/catalog/ - title: MCP Toolkit - description: Learn about the MCP toolkit to manage MCP servers and clients + description: Learn about the MCP Toolkit to manage MCP servers and clients icon: /icons/toolkit.svg link: /ai/mcp-catalog-and-toolkit/toolkit/ --- -The Model Context Protocol (MCP) is a modern standard that transforms AI agents from passive responders into action-oriented systems. By standardizing how tools are described, discovered, and invoked, MCP enables agents to securely query APIs, access data, and execute services across diverse environments. 
+The Model Context Protocol (MCP) is a modern standard that transforms AI agents +from passive responders into action-oriented systems. By standardizing how tools +are described, discovered, and invoked, MCP enables agents to securely query +APIs, access data, and run services across different environments. -As agents move into production, MCP solves common integration challenges — interoperability, reliability, and security — by providing a consistent, decoupled, and scalable interface between agents and tools. Just as containers redefined software deployment, MCP is reshaping how AI systems interact with the world. +As agents move into production, MCP solves common integration challenges — +interoperability, reliability, and security — by providing a consistent, +decoupled, and scalable interface between agents and tools. Just as containers +redefined software deployment, MCP is reshaping how AI systems interact with the +world. > **Example** -> +> > In simple terms, an MCP server is a way for an LLM to interact with an external system. -> +> > For example: > If you ask a model to create a meeting, it needs to communicate with your calendar app to do that. > An MCP server for your calendar app provides _tools_ that perform atomic actions, such as: @@ -36,17 +43,19 @@ As agents move into production, MCP solves common integration challenges — int ## What is Docker MCP Catalog and Toolkit? -Docker MCP Catalog and Toolkit is a comprehensive solution for securely building, sharing, and running MCP tools. It simplifies the developer experience across these key areas: +Docker MCP Catalog and Toolkit is a solution for securely building, sharing, and +running MCP tools. 
It simplifies the developer experience across these areas: -- Discovery: A central catalog with verified, versioned tools -- Credential Management: OAuth-based and secure by default -- Execution: Tools run in isolated, containerized environments -- Portability: Use MCP tools across Claude, Cursor, VS Code, and more — no code changes needed +- Discovery: A central catalog with verified, versioned tools. +- Credential management: OAuth-based and secure by default. +- Execution: Tools run in isolated, containerized environments. +- Portability: Use MCP tools across Claude, Cursor, VS Code, and more—no code + changes needed. With Docker Hub and the MCP Toolkit, you can: -- Launch MCP servers in seconds -- Add tools via CLI or GUI -- Rely on Docker's pull-based infrastructure for trusted delivery +- Launch MCP servers in seconds. +- Add tools using the CLI or GUI. +- Rely on Docker's pull-based infrastructure for trusted delivery. {{< grid >}} diff --git a/content/manuals/ai/mcp-catalog-and-toolkit/catalog.md b/content/manuals/ai/mcp-catalog-and-toolkit/catalog.md index 7526a7833fb1..705145b82400 100644 --- a/content/manuals/ai/mcp-catalog-and-toolkit/catalog.md +++ b/content/manuals/ai/mcp-catalog-and-toolkit/catalog.md @@ -4,52 +4,62 @@ description: Learn about the benefits of the MCP Catalog, how you can use it, an keywords: docker hub, mcp, mcp servers, ai agents, catalog, docker --- -The [Docker MCP Catalog](https://hub.docker.com/mcp) is a centralized, trusted registry for discovering, sharing, and running MCP-compatible tools. Seamlessly integrated into Docker Hub, it offers verified, versioned, and curated MCP servers packaged as Docker images. The catalog is also available in Docker Desktop. +The [Docker MCP Catalog](https://hub.docker.com/mcp) is a centralized, trusted +registry for discovering, sharing, and running MCP-compatible tools. Integrated +with Docker Hub, it offers verified, versioned, and curated MCP servers +packaged as Docker images. 
The catalog is also available in Docker Desktop. The catalog solves common MCP server challenges: -- Environment conflicts: Tools often need specific runtimes that may clash with existing setups. -- Lack of isolation: Traditional setups risk exposing the host system. -- Setup complexity: Manual installation and configuration result in slow adoption. -- Inconsistency across platforms: Tools may behave unpredictably on different OSes. +- Environment conflicts. Tools often need specific runtimes that might clash + with existing setups. +- Lack of isolation. Traditional setups risk exposing the host system. +- Setup complexity. Manual installation and configuration slow adoption. +- Inconsistency across platforms. Tools might behave unpredictably on different + operating systems. -With Docker, each MCP server runs as a self-contained container so it is -portable, isolated, and consistent. You can launch tools instantly using Docker -CLI or Docker Desktop, without worrying about dependencies or compatibility. +With Docker, each MCP server runs as a self-contained container. This makes it +portable, isolated, and consistent. You can launch tools instantly using the +Docker CLI or Docker Desktop, without worrying about dependencies or +compatibility. ## Key features -- Over 100 verified MCP servers in one place -- Publisher verification and versioned releases -- Pull-based distribution using Docker's infrastructure -- Tools provided by partners such as New Relic, Stripe, Grafana, and more +- Over 100 verified MCP servers in one place. +- Publisher verification and versioned releases. +- Pull-based distribution using Docker infrastructure. +- Tools provided by partners such as New Relic, Stripe, Grafana, and more. ## How it works -Each tool in the MCP Catalog is packaged as a Docker image with metadata: +Each tool in the MCP Catalog is packaged as a Docker image with metadata. -- Discover tools via Docker Hub under the `mcp/` namespace. 
-- Connect tools to their preferred agents with simple configuration through the [MCP Toolkit](toolkit.md). +- Discover tools on Docker Hub under the `mcp/` namespace. +- Connect tools to your preferred agents with simple configuration through the + [MCP Toolkit](toolkit.md). - Pull and run tools using Docker Desktop or the CLI. Each catalog entry displays: -- Tool description and metadata -- Version history -- List of tools provided by the MCP server -- Example configuration for agent integration +- Tool description and metadata. +- Version history. +- List of tools provided by the MCP server. +- Example configuration for agent integration. ## Use an MCP server from the catalog -To use an MCP server from the catalog, see [MCP toolkit](toolkit.md). +To use an MCP server from the catalog, see [MCP Toolkit](toolkit.md). ## Contribute an MCP server to the catalog -The MCP server registry is available at https://github.com/docker/mcp-registry. To submit an MCP server, -follow the [contributing guidelines](https://github.com/docker/mcp-registry/blob/main/CONTRIBUTING.md). +The MCP server registry is available at +https://github.com/docker/mcp-registry. To submit an MCP server, follow the +[contributing guidelines](https://github.com/docker/mcp-registry/blob/main/CONTRIBUTING.md). -When your pull request is reviewed and approved, your MCP server is available in 24 hours on: +When your pull request is reviewed and approved, your MCP server is available +within 24 hours on: -- Docker Desktop's [MCP Toolkit feature](toolkit.md) -- The [Docker MCP catalog](https://hub.docker.com/mcp) -- The [Docker Hub](https://hub.docker.com/u/mcp) `mcp` namespace (for MCP servers built by Docker) +- Docker Desktop's [MCP Toolkit feature](toolkit.md). +- The [Docker MCP Catalog](https://hub.docker.com/mcp). +- The [Docker Hub](https://hub.docker.com/u/mcp) `mcp` namespace (for MCP + servers built by Docker). 
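Catalog entries under the `mcp/` namespace are ordinary Docker images, so wiring one into an agent is a matter of configuration. As a minimal sketch of the compose-style `gordon-mcp.yml` file referenced earlier in this PR (the service and image names here are illustrative examples; check each catalog entry for its exact image name):

```yaml
# gordon-mcp.yml — illustrative sketch, not a definitive configuration.
# Each service points at an MCP server image from the mcp/ namespace.
services:
  time:
    image: mcp/time
  fetch:
    image: mcp/fetch
```

Each listed server is pulled like any other image and exposed to the agent as a set of tools.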
diff --git a/content/manuals/ai/mcp-catalog-and-toolkit/toolkit.md b/content/manuals/ai/mcp-catalog-and-toolkit/toolkit.md index cba0d935cfec..f0ca96eeb727 100644 --- a/content/manuals/ai/mcp-catalog-and-toolkit/toolkit.md +++ b/content/manuals/ai/mcp-catalog-and-toolkit/toolkit.md @@ -7,7 +7,11 @@ aliases: - /ai/gordon/mcp/gordon-mcp-server/ --- -The Docker MCP Toolkit is a gateway that enables seamless setup, management, and execution of containerized MCP servers and their connections to AI agents. It removes the friction from tool usage by offering secure defaults, one-click setup, and support for a growing ecosystem of LLM-based clients. It is the fastest path from MCP tool discovery to local execution. +The Docker MCP Toolkit is a gateway that lets you set up, manage, and run +containerized MCP servers and connect them to AI agents. It removes friction +from tool usage by offering secure defaults, one-click setup, and support for a +growing ecosystem of LLM-based clients. It is the fastest way from MCP tool +discovery to local execution. ## Key features @@ -16,34 +20,33 @@ The Docker MCP Toolkit is a gateway that enables seamless setup, management, and - Zero manual setup: No dependency management, runtime configuration, or server setup required. - Functions as both an MCP server aggregator and a gateway for clients to access installed MCP servers. -## How the MCP toolkit works +## How the MCP Toolkit works MCP introduces two core concepts: MCP clients and MCP servers. -- MCP clients are typically embedded in LLM-based applications, such as - the Claude Desktop App. They request resources or actions. -- MCP servers are launched by the client to perform the requested tasks, - using any necessary tools, languages, or processes. +- MCP clients are typically embedded in LLM-based applications, such as the + Claude Desktop app. They request resources or actions. 
+- MCP servers are launched by the client to perform the requested tasks, using
+  any necessary tools, languages, or processes.

 Docker standardizes the development, packaging, and distribution of
 applications, including MCP servers. By packaging MCP servers as containers,
-Docker eliminates issues related to isolation and environment differences. Users
+Docker eliminates issues related to isolation and environment differences. You
 can run a container directly, without managing dependencies or configuring
 runtimes.

-Depending on the MCP server, the tools it provides may run within the same container
-as the server or in dedicated containers:
-
+Depending on the MCP server, the tools it provides might run within the same
+container as the server or in dedicated containers:

 {{< tabs group="" >}}
 {{< tab name="Single container">}}

-![Visualisation of the MCP toolkit](/assets/images/mcp_servers.png)
+![Diagram showing a single-container MCP Toolkit setup.](/assets/images/mcp_servers.png)

 {{< /tab >}}
 {{< tab name="Separate containers">}}

-![Visualisation of the MCP toolkit](/assets/images/mcp_servers_2.png)
+![Diagram showing a multi-container MCP Toolkit setup.](/assets/images/mcp_servers_2.png)

 {{< /tab >}}
 {{< /tabs >}}

@@ -134,7 +137,7 @@ can interact with the installed MCP servers, turning the MCP Toolkit into a gate

 To install a client:

 1. In Docker Desktop, select **MCP Toolkit** and select the **Clients** tab.
-2. Find the client of your choice and select **Connect**.
+1. Find the client of your choice and select **Connect**.

 Your client can now interact with the MCP Toolkit.

@@ -146,10 +149,10 @@ You can simply install these 2 MCP servers in the MCP Toolkit, and add Claude
 Desktop as a client:

 1. From the **MCP Toolkit** menu, select the **Catalog** tab and find the **Puppeteer** server and add it.
-2. Repeat for the **GitHub Official** server.
-3. From the **Clients** tab, select **Connect** next to **Claude Desktop**. Restart
+1. Repeat for the **GitHub Official** server.
+1. From the **Clients** tab, select **Connect** next to **Claude Desktop**. Restart
   Claude Desktop if it's running, and it can now access all the servers in the MCP Toolkit.
-4. Within Claude Desktop, run a test by submitting the following prompt using the Sonnet 3.5 model:
+1. Within Claude Desktop, run a test by submitting the following prompt using the Sonnet 3.5 model:

 ```text
 Take a screenshot of docs.docker.com and then invert the colors
diff --git a/content/manuals/ai/mcp-gateway/_index.md b/content/manuals/ai/mcp-gateway/_index.md
index 36fbac551a51..75fae82b571e 100644
--- a/content/manuals/ai/mcp-gateway/_index.md
+++ b/content/manuals/ai/mcp-gateway/_index.md
@@ -107,4 +107,4 @@ To view all the commands and configuration options, go to the [mcp-gateway repos

 ## Related pages

-- [Docker MCP toolkit and catalog](/manuals/ai/mcp-catalog-and-toolkit/_index.md)
+- [Docker MCP Toolkit and catalog](/manuals/ai/mcp-catalog-and-toolkit/_index.md)
diff --git a/content/manuals/ai/model-runner/_index.md b/content/manuals/ai/model-runner/_index.md
index f21e329ada88..75a5b70df300 100644
--- a/content/manuals/ai/model-runner/_index.md
+++ b/content/manuals/ai/model-runner/_index.md
@@ -74,36 +74,45 @@ Docker Engine only:

 {{< /tab >}}
 {{< /tabs >}}

-
 ## How it works

-Models are pulled from Docker Hub the first time they're used and stored locally. They're loaded into memory only at runtime when a request is made, and unloaded when not in use to optimize resources. Since models can be large, the initial pull may take some time — but after that, they're cached locally for faster access. You can interact with the model using [OpenAI-compatible APIs](#what-api-endpoints-are-available).
+Models are pulled from Docker Hub the first time you use them and are stored
+locally. They load into memory only at runtime when a request is made, and
+unload when not in use to optimize resources.
Because models can be large, the +initial pull may take some time. After that, they're cached locally for faster +access. You can interact with the model using +[OpenAI-compatible APIs](#what-api-endpoints-are-available). > [!TIP] > > Using Testcontainers or Docker Compose? > [Testcontainers for Java](https://java.testcontainers.org/modules/docker_model_runner/) > and [Go](https://golang.testcontainers.org/modules/dockermodelrunner/), and -> [Docker Compose](/manuals/ai/compose/models-and-compose.md) now support Docker Model Runner. +> [Docker Compose](/manuals/ai/compose/models-and-compose.md) now support Docker +> Model Runner. ## Enable Docker Model Runner ### Enable DMR in Docker Desktop -1. In the settings view, navigate to the **Beta features** tab. -1. Tick the **Enable Docker Model Runner** setting. -1. If you are running on Windows with a supported NVIDIA GPU, you should also see and be able to tick the **Enable GPU-backed inference** setting. -1. Optional: If you want to enable TCP support, select the **Enable host-side TCP support** - 1. In the **Port** field, type the port of your choice. - 1. If you are interacting with Model Runner from a local frontend web app, - in **CORS Allows Origins**, select the origins that Model Runner should accept requests from. - An origin is the URL where your web app is running, for example `http://localhost:3131`. +1. In the settings view, go to the **Beta features** tab. +1. Select the **Enable Docker Model Runner** setting. +1. If you use Windows with a supported NVIDIA GPU, you also see and can select + **Enable GPU-backed inference**. +1. Optional: To enable TCP support, select **Enable host-side TCP support**. + 1. In the **Port** field, type the port you want to use. + 1. If you interact with Model Runner from a local frontend web app, in + **CORS Allows Origins**, select the origins that Model Runner should + accept requests from. 
An origin is the URL where your web app runs, for + example `http://localhost:3131`. -You can now use the `docker model` command in the CLI and view and interact with your local models in the **Models** tab in the Docker Desktop Dashboard. +You can now use the `docker model` command in the CLI and view and interact +with your local models in the **Models** tab in the Docker Desktop Dashboard. > [!IMPORTANT] > -> For Docker Desktop versions 4.41 and earlier, this setting lived under the **Experimental features** tab on the **Features in development** page. +> For Docker Desktop versions 4.41 and earlier, this setting was under the +> **Experimental features** tab on the **Features in development** page. ### Enable DMR in Docker Engine @@ -141,7 +150,9 @@ You can now use the `docker model` command in the CLI and view and interact with ### Update DMR in Docker Engine -To update Docker Model Runner in Docker Engine, uninstall it with [`docker model uninstall-runner`](/reference/cli/docker/model/uninstall-runner/) then reinstall it: +To update Docker Model Runner in Docker Engine, uninstall it with +[`docker model uninstall-runner`](/reference/cli/docker/model/uninstall-runner/) +then reinstall it: ```console docker model uninstall-runner --images && docker model install-runner @@ -149,7 +160,8 @@ docker model uninstall-runner --images && docker model install-runner > [!NOTE] > With the above command, local models are preserved. -> To delete the models during the upgrade, add the `--models` option to the `uninstall-runner` command. +> To delete the models during the upgrade, add the `--models` option to the +> `uninstall-runner` command. ## Pull a model @@ -157,20 +169,22 @@ Models are cached locally. > [!NOTE] > -> When working with the Docker CLI, you can also pull models directly from [HuggingFace](https://huggingface.co/). +> When you use the Docker CLI, you can also pull models directly from +> [HuggingFace](https://huggingface.co/). 
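Once host-side TCP support is enabled, clients reach Model Runner through an OpenAI-compatible endpoint. As a minimal sketch of the request body such a client would POST (the port and model tag below are assumptions based on the examples on this page; adjust them to your setup):

```python
import json

# Assumed values: the port comes from the "Enable host-side TCP support"
# setting, and the model tag matches the pull example elsewhere on the page.
BASE_URL = "http://localhost:12434/engines/v1/chat/completions"
MODEL = "ai/smollm2:360M-Q4_K_M"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }

# Serialize the body you would POST to BASE_URL.
body = json.dumps(build_chat_request("Give me a fact about whales."))
```

The same body shape works from a browser frontend, which is when the CORS origin setting described above matters.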
{{< tabs group="release" >}} {{< tab name="From Docker Desktop">}} 1. Select **Models** and select the **Docker Hub** tab. -1. Find the model of your choice and select **Pull**. +1. Find the model you want and select **Pull**. -![screencapture of the Docker Hub view](./images/dmr-catalog.png) +![Screenshot showing the Docker Hub view.](./images/dmr-catalog.png) {{< /tab >}} {{< tab name="From the Docker CLI">}} -Use the [`docker model pull` command](/reference/cli/docker/model/pull/). For example: +Use the [`docker model pull` command](/reference/cli/docker/model/pull/). +For example: ```bash {title="Pulling from Docker Hub"} docker model pull ai/smollm2:360M-Q4_K_M @@ -188,10 +202,10 @@ docker model pull hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF {{< tabs group="release" >}} {{< tab name="From Docker Desktop">}} -1. Select **Models** and select the **Local** tab -1. Click the play button. The interactive chat screen opens. +1. Select **Models** and select the **Local** tab. +1. Select the play button. The interactive chat screen opens. -![screencapture of the Local view](./images/dmr-run.png) +![Screenshot showing the Local view.](./images/dmr-run.png) {{< /tab >}} {{< tab name="From the Docker CLI" >}} @@ -203,14 +217,14 @@ Use the [`docker model run` command](/reference/cli/docker/model/run/). ## Troubleshooting -To troubleshoot potential issues, display the logs: +To troubleshoot issues, display the logs: {{< tabs group="release" >}} {{< tab name="From Docker Desktop">}} Select **Models** and select the **Logs** tab. -![screencapture of the Models view](./images/dmr-logs.png) +![Screenshot showing the Models view.](./images/dmr-logs.png) {{< /tab >}} {{< tab name="From the Docker CLI">}} @@ -224,9 +238,11 @@ Use the [`docker model logs` command](/reference/cli/docker/model/logs/). > [!NOTE] > -> This works for any Container Registry supporting OCI Artifacts, not only Docker Hub. 
+> This works for any Container Registry supporting OCI Artifacts, not only +> Docker Hub. -You can tag existing models with a new name and publish them under a different namespaceand repository: +You can tag existing models with a new name and publish them under a different +namespace and repository: ```console # Tag a pulled model under a new name @@ -236,27 +252,33 @@ $ docker model tag ai/smollm2 myorg/smollm2 $ docker model push myorg/smollm2 ``` -For more details, see the [`docker model tag`](/reference/cli/docker/model/tag) and [`docker model push`](/reference/cli/docker/model/push) command documentation. +For more details, see the [`docker model tag`](/reference/cli/docker/model/tag) +and [`docker model push`](/reference/cli/docker/model/push) command +documentation. -You can also directly package a model file in GGUF format as an OCI Artifact and publish it to Docker Hub. +You can also package a model file in GGUF format as an OCI Artifact and publish +it to Docker Hub. ```console -# Download a model file in GGUF format, e.g. from HuggingFace +# Download a model file in GGUF format, for example from HuggingFace $ curl -L -o model.gguf https://huggingface.co/TheBloke/Mistral-7B-v0.1-GGUF/resolve/main/mistral-7b-v0.1.Q4_K_M.gguf # Package it as OCI Artifact and push it to Docker Hub $ docker model package --gguf "$(pwd)/model.gguf" --push myorg/mistral-7b-v0.1:Q4_K_M ``` -For more details, see the [`docker model package`](/reference/cli/docker/model/package/) command documentation. +For more details, see the +[`docker model package`](/reference/cli/docker/model/package/) command +documentation. ## Example: Integrate Docker Model Runner into your software development lifecycle ### Sample project -You can now start building your Generative AI application powered by the Docker Model Runner. +You can now start building your generative AI application powered by Docker +Model Runner. -If you want to try an existing GenAI application, follow these instructions. 
+If you want to try an existing GenAI application, follow these steps: 1. Set up the sample app. Clone and run the following repository: @@ -264,21 +286,24 @@ If you want to try an existing GenAI application, follow these instructions. $ git clone https://github.com/docker/hello-genai.git ``` -2. In your terminal, navigate to the `hello-genai` directory. +1. In your terminal, go to the `hello-genai` directory. -3. Run `run.sh` for pulling the chosen model and run the app(s): +1. Run `run.sh` to pull the chosen model and run the app. -4. Open you app in the browser at the addresses specified in the repository [README](https://github.com/docker/hello-genai). +1. Open your app in the browser at the addresses specified in the repository + [README](https://github.com/docker/hello-genai). -You'll see the GenAI app's interface where you can start typing your prompts. +You see the GenAI app's interface where you can start typing your prompts. -You can now interact with your own GenAI app, powered by a local model. Try a few prompts and notice how fast the responses are — all running on your machine with Docker. +You can now interact with your own GenAI app, powered by a local model. Try a +few prompts and notice how fast the responses are — all running on your machine +with Docker. ### Use Model Runner in GitHub Actions -Here is an example on how to use Model Runner as part of a GitHub workflow. -The example installs Model Runner, tests the installation, pulls and runs a model, -interacts with the model via the API and finally deletes the model. +Here is an example of how to use Model Runner as part of a GitHub workflow. +The example installs Model Runner, tests the installation, pulls and runs a +model, interacts with the model via the API, and deletes the model. 
```yaml {title="dmr-run.yml", collapse=true} name: Docker Model Runner Example Workflow diff --git a/content/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md b/content/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md index 2bb9ea66fe30..1bf0b7871e76 100644 --- a/content/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md +++ b/content/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md @@ -321,7 +321,7 @@ For more information, see [Networking](/manuals/desktop/features/networking.md#n |         `enableInferenceTCP` | | Enable host-side TCP support. This setting requires Docker Model Runner setting to be enabled first. | | |         `enableInferenceTCPPort` | | Specifies the exposed TCP port. This setting requires Docker Model Runner setting to be enabled first. | | |         `enableInferenceCORS` | | Specifies the allowed CORS origins. Empty string to deny all,`*` to accept all, or a list of comma-separated values. This setting requires Docker Model Runner setting to be enabled first. | | -| `enableDockerMCPToolkit` | | If `allowBetaFeatures` is true, setting `enableDockerMCPToolkit` to `true` enables the [MCP toolkit feature](/manuals/ai/mcp-catalog-and-toolkit/toolkit.md) by default. You can independently control this setting from the `allowBetaFeatures` setting. | | +| `enableDockerMCPToolkit` | | If `allowBetaFeatures` is true, setting `enableDockerMCPToolkit` to `true` enables the [MCP Toolkit feature](/manuals/ai/mcp-catalog-and-toolkit/toolkit.md) by default. You can independently control this setting from the `allowBetaFeatures` setting. | | | `allowExperimentalFeatures` | | If `value` is set to `true`, experimental features are enabled. | Docker Desktop version 4.41 and earlier | ### Enhanced Container Isolation
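The `enableDockerMCPToolkit` and `allowBetaFeatures` rows in the settings table above interact, so a combined sketch of the relevant `admin-settings.json` fragment may help (the `value`/`locked` shape follows the settings documented on that page; treat the exact keys and values as illustrative):

```json
{
  "configurationFileVersion": 2,
  "allowBetaFeatures": {
    "locked": true,
    "value": true
  },
  "enableDockerMCPToolkit": {
    "locked": false,
    "value": true
  }
}
```

Here beta features are locked on for all users, while the MCP Toolkit is enabled by default but left unlocked so users can turn it off.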