README.md
NetApp DataOps Toolkit
=========

The NetApp DataOps Toolkit is a collection of Python-based client tools that simplify the management of data volumes and data science/engineering workspaces that are backed by high-performance, scale-out NetApp storage. Key capabilities include:

- Rapidly provision new data volumes (file shares) or JupyterLab workspaces that are backed by high-performance, scale-out NetApp storage.
- Near-instantaneously clone data volumes (file shares) or JupyterLab workspaces in order to enable experimentation or rapid iteration.
- Near-instantaneously save snapshots of data volumes (file shares) or JupyterLab workspaces for backup and/or traceability/baselining.
- Replicate data volumes (file shares) across different environments.

The toolkit includes [MCP Servers](mcp_servers.md) that expose many of these capabilities as "tools" that can be utilized by AI agents.

The latest stable release of the NetApp DataOps Toolkit is version 2.5.0. It is recommended to always use the latest stable release. You can access the documentation for the latest stable release [here](https://github.com/NetApp/netapp-dataops-toolkit/tree/v2.5.0).

## Getting Started

The NetApp DataOps Toolkit includes the following client tools:

- The [NetApp DataOps Toolkit for Kubernetes](netapp_dataops_k8s/) includes data volume management, JupyterLab management, and data movement capabilities for users that have access to a Kubernetes cluster.
- The [NetApp DataOps Toolkit for Traditional Environments](netapp_dataops_traditional/) includes basic data volume management capabilities. It will run on most Linux and macOS clients, and does not require Kubernetes.
# MCP Servers Included in the NetApp DataOps Toolkit

The NetApp DataOps Toolkit includes multiple MCP Servers. These MCP Servers expose DataOps Toolkit capabilities as "tools" that can be utilized by AI agents.

## NetApp DataOps Toolkit MCP Server for ONTAP

The [NetApp DataOps Toolkit MCP Server for ONTAP](netapp_dataops_traditional/docs/mcp_server.md) is an MCP Server that enables AI agents to manage volumes and snapshots on an ONTAP system, including NetApp AFF and FAS appliances, Amazon FSx for NetApp ONTAP instances, and NetApp Cloud Volumes ONTAP instances.

### Available Tools

- **Create Volume**: Rapidly provision new data volumes with customizable configurations.
- **Clone Volume**: Create near-instantaneous, space-efficient clones of existing volumes using NetApp FlexClone technology.
- **List Volumes**: Retrieve a list of all existing data volumes, with optional space usage details.
- **Mount Volume**: Mount existing data volumes locally as read-only or read-write.
- **Create Snapshot**: Create space-efficient, read-only copies of data volumes for versioning and traceability.
- **List Snapshots**: Retrieve a list of all snapshots for a specific volume.
- **Create SnapMirror Relationship**: Set up SnapMirror relationships for efficient data replication.
- **List SnapMirror Relationships**: Retrieve a list of all SnapMirror relationships on the storage system.

## NetApp DataOps Toolkit for Kubernetes MCP Server

The [NetApp DataOps Toolkit for Kubernetes MCP Server](netapp_dataops_k8s/docs/mcp_server_k8s.md) is an MCP Server that enables AI agents to manage persistent volumes, volume snapshots, and JupyterLab workspaces within a Kubernetes cluster. This MCP server relies on the NetApp Trident CSI driver and is compatible with NetApp AFF and FAS appliances, Amazon FSx for NetApp ONTAP, Azure NetApp Files, Google Cloud NetApp Volumes, and NetApp Cloud Volumes ONTAP.

### Available Tools

This MCP Server provides the following tools for managing JupyterLab workspaces and volumes in a Kubernetes environment:

#### Workspace Management Tools

- **CreateJupyterLab**: Create a new JupyterLab workspace.
- **CloneJupyterLab**: Clone an existing JupyterLab workspace.
- **ListJupyterLabs**: List all JupyterLab workspaces.
- **CreateJupyterLabSnapshot**: Create a snapshot of a JupyterLab workspace.
- **ListJupyterLabSnapshots**: List all snapshots of JupyterLab workspaces.

#### Volume Management Tools

- **CreateVolume**: Create a new volume.
- **CloneVolume**: Clone an existing volume.
- **ListVolumes**: List all volumes.
- **CreateVolumeSnapshot**: Create a snapshot of a volume.
- **ListVolumeSnapshots**: List all snapshots of volumes.
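An MCP client drives tools like these over JSON-RPC 2.0 using the protocol's `tools/call` method on the server's transport. As a hedged sketch of what such a call looks like on the wire — the tool name comes from the list above, but the argument names shown here are hypothetical; consult the linked server documentation for each tool's actual input schema:

```python
import json

# A sketch of an MCP "tools/call" request that an AI agent's MCP client
# might send to the Kubernetes MCP server. "CreateVolume" is a real tool
# name from the list above; the arguments below are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "CreateVolume",
        "arguments": {
            "volume_name": "project1",  # hypothetical argument name
            "volume_size": "10Gi",      # hypothetical argument name
        },
    },
}

# MCP messages over the stdio transport are serialized as JSON
print(json.dumps(request))
```

The MCP client library handles this serialization for you; the point is only that each bullet above maps to one `tools/call` tool name.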
netapp_dataops_k8s/README.md
@@ -1,7 +1,7 @@
1
1
NetApp DataOps Toolkit for Kubernetes
2
2
=========
3
3
4
-
The NetApp DataOps Toolkit for Kubernetes is a Python library that makes it simple for developers, data scientists, DevOps engineers, and data engineers to perform various data management tasks within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new persistent volume or data science workspace, the ability to almost instantaneously clone a volume or workspace, the ability to almost instantaneously save off a snapshot of a volume or workspace for traceability/baselining, and the ability to move data between S3 compatible object storage and a Kubernetes persistent volume.
4
+
The NetApp DataOps Toolkit for Kubernetes is a Python library that makes it simple for developers, data scientists, DevOps engineers, and data engineers to perform various data management tasks within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new persistent volume or data science workspace, the ability to almost instantaneously clone a volume or workspace, the ability to almost instantaneously save off a snapshot of a volume or workspace for traceability/baselining, and the ability to move data between S3 compatible object storage and a Kubernetes persistent volume. The toolkit also includes an [MCP Server](docs/mcp_server_k8s.md) that exposes many of the capabilities as "tools" that can be utilized by AI agents.
5
5
6
6
## Compatibility
7
7
@@ -15,16 +15,17 @@ The toolkit is currently compatible with Trident versions 20.07 and above. Addit
15
15
16
16
- ontap-nas
17
17
- ontap-nas-flexgroup
18
-
- gcp-cvs
19
18
- azure-netapp-files
19
+
- google-cloud-netapp-volume
20
+
- gcp-cvs
20
21
21
22
The toolkit is currently compatible with all versions of the BeeGFS CSI driver, though not all functionality is supported by BeeGFS. Operations that are not supported by BeeGFS are noted within the documentation.
22
23
23
24
## Installation
24
25
25
26
### Prerequisites
26
27
27
-
The NetApp DataOps Toolkit for Kubernetes requires that Python 3.8, 3.9, 3.10, or 3.11 be installed on the local host. Additionally, the toolkit requires that pip for Python3 be installed on the local host. For more details regarding pip, including installation instructions, refer to the [pip documentation](https://pip.pypa.io/en/stable/installing/).
28
+
The NetApp DataOps Toolkit for Kubernetes requires that Python 3.8, 3.9, 3.10, 3,11, 3.12, or 3.13 be installed on the local host. Additionally, the toolkit requires that pip for Python3 be installed on the local host. For more details regarding pip, including installation instructions, refer to the [pip documentation](https://pip.pypa.io/en/stable/installing/).
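Before installing, you can confirm both prerequisites from a terminal; a minimal check (command names only, nothing toolkit-specific):

```sh
# Confirm a Python 3 interpreter is present
# (the toolkit supports Python 3.8 through 3.13)
python3 --version

# Confirm pip is available for that interpreter
python3 -m pip --version
```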
[…]

The NetApp DataOps Toolkit for Kubernetes can be utilized from any Linux or macOS client that has network access to the Kubernetes cluster.

The toolkit requires that a valid kubeconfig file be present on the client, located at `$HOME/.kube/config` or at another path specified by the `KUBECONFIG` environment variable. Refer to the [Kubernetes documentation](https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/) for more information regarding kubeconfig files.
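For example, to point the toolkit at a kubeconfig stored somewhere other than the default location, export the `KUBECONFIG` variable before invoking the toolkit (the file path below is illustrative — substitute your own):

```sh
# Use a non-default kubeconfig for all subsequent toolkit commands
# (the path is illustrative; substitute the kubeconfig for your cluster)
export KUBECONFIG="$HOME/.kube/dataops-cluster-config"
echo "$KUBECONFIG"
```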
[…]

The NetApp DataOps Toolkit for Kubernetes provides the following capabilities.

### MCP Server

The NetApp DataOps Toolkit for Kubernetes includes an [MCP Server](docs/mcp_server_k8s.md) that exposes many of the [Workspace Management](docs/workspace_management.md) and [Volume Management](docs/volume_management.md) capabilities as tools that can be utilized by AI agents.

### Workspace Management

The NetApp DataOps Toolkit can be used to manage data science workspaces within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new JupyterLab workspace, the ability to almost instantaneously clone a JupyterLab workspace, and the ability to almost instantaneously save off a snapshot of a JupyterLab workspace for traceability/baselining.
# NetApp DataOps Toolkit for Kubernetes MCP Server

## Description

NetApp DataOps Toolkit MCP Server is an open-source, Python-based server component that makes your Kubernetes DataOps environments accessible via the Model Context Protocol (MCP). The MCP server enables standardized, interactive, and programmatic connectivity between DataOps resources and modern ML/data platforms that support the MCP standard.

> [!NOTE]
> This MCP server uses the stdio transport, as shown in the [MCP Server Quickstart](https://modelcontextprotocol.io/quickstart/server), making it a "local MCP server".

## Available Tools

The MCP server provides the following tools for managing JupyterLab workspaces and volumes in a Kubernetes environment:

### Workspace Management Tools

- **CreateJupyterLab**: Create a new JupyterLab workspace.
- **CloneJupyterLab**: Clone an existing JupyterLab workspace.
- **ListJupyterLabs**: List all JupyterLab workspaces.
- **CreateJupyterLabSnapshot**: Create a snapshot of a JupyterLab workspace.
- **ListJupyterLabSnapshots**: List all snapshots of JupyterLab workspaces.

### Volume Management Tools

- **CreateVolume**: Create a new volume.
- **CloneVolume**: Clone an existing volume.
- **ListVolumes**: List all volumes.
- **CreateVolumeSnapshot**: Create a snapshot of a volume.
- **ListVolumeSnapshots**: List all snapshots of volumes.

## Quick Start

### Prerequisites

- Python >= 3.10
- [uv](https://docs.astral.sh/uv/) or [pip](https://pypi.org/project/pip/)
- Access to a Kubernetes environment with NetApp Trident installed and configured. Refer to [the main README](../README.md) for full compatibility details.

To run the MCP tools from the MCP server, a valid kubeconfig file must be present on the local host. Refer to [the "Getting Started: Standard Usage" section from the main README](../README.md#getting-started) of the NetApp DataOps Toolkit for Kubernetes to learn more.

### Usage Instructions

#### Run with uv (recommended)

To run the MCP server using uv, run the following command. You do not need to install the NetApp DataOps Toolkit package before running this command.

[…]

To install the NetApp DataOps Toolkit for Kubernetes, run the following command.

```sh
python3 -m pip install netapp-dataops-k8s
```

After installation, the `netapp_dataops_k8s_mcp.py` command will be available in your PATH for direct usage.

#### Usage

##### Example JSON Config

To use the MCP server with an MCP client, you need to configure the client to use the server. For many clients (such as [VS Code](https://code.visualstudio.com/docs/copilot/chat/mcp-servers), [Claude Desktop](https://modelcontextprotocol.io/quickstart/user), and [AnythingLLM](https://docs.anythingllm.com/mcp-compatibility/overview)), this requires editing a config file that is in JSON format. Below is an example. Refer to the documentation for your MCP client for specific formatting details.
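A typical stdio-server entry has roughly the following shape. This is a sketch, not a client-specific config: the top-level key (`mcpServers` here, as used by Claude Desktop) and the exact schema vary by client, and the server name key is an arbitrary label of your choosing. The `command` shown assumes the pip installation above placed `netapp_dataops_k8s_mcp.py` in your PATH.

```json
{
  "mcpServers": {
    "netapp-dataops-k8s": {
      "command": "netapp_dataops_k8s_mcp.py",
      "args": []
    }
  }
}
```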