Commit 76e5ac5

Merge pull request #31 from NetApp/release-v2.6.0

Release v2.6.0

2 parents: 3b73ac0 + 6cc04a3

File tree

15 files changed: +1208 −173 lines


README.md

Lines changed: 10 additions & 10 deletions
@@ -1,20 +1,20 @@
 NetApp DataOps Toolkit
 =========
 
-The NetApp DataOps Toolkit is a Python-based tool that simplifies the management of development/training workspaces and inference servers that are backed by high-performance, scale-out NetApp storage. Key capabilities include:
-- Rapidly provision new high-capacity JupyterLab workspaces that are backed by high-performance, scale-out NetApp storage.
-- Rapidly provision new NVIDIA Triton Inference Server instances that are backed by enterprise-class NetApp storage.
-- Near-instantaneously clone high-capacity JupyterLab workspaces in order to enable experimentation or rapid iteration.
-- Near-instantaneously save snapshots of high-capacity JupyterLab workspaces for backup and/or traceability/baselining.
-- Near-instantaneously provision, clone, and snapshot high-capacity, high-performance data volumes.
+The NetApp DataOps Toolkit is a collection of Python-based client tools that simplify the management of data volumes and data science/engineering workspaces that are backed by high-performance, scale-out NetApp storage. Key capabilities include:
+- Rapidly provision new data volumes (file shares) or JupyterLab workspaces that are backed by high-performance, scale-out NetApp storage.
+- Near-instantaneously clone data volumes (file shares) or JupyterLab workspaces in order to enable experimentation or rapid iteration.
+- Near-instantaneously save snapshots of data volumes (file shares) or JupyterLab workspaces for backup and/or traceability/baselining.
+- Replicate data volumes (file shares) across different environments.
 
-## Getting Started
+The toolkit includes [MCP Servers](mcp_servers.md) that expose many of these capabilities as "tools" that can be utilized by AI agents.
 
-The latest stable release of the NetApp DataOps Toolkit is version 2.5.0. It is recommended to always use the latest stable release. You can access the documentation for the latest stable release [here](https://github.com/NetApp/netapp-dataops-toolkit/tree/v2.5.0)
+## Getting Started
 
-The NetApp DataOps Toolkit comes in two different flavors. For access to the most capabilities, we recommend using the [NetApp DataOps Toolkit for Kubernetes](netapp_dataops_k8s/). This flavor supports the full functionality of the toolkit, including JupyterLab workspace and NVIDIA Triton Inference Server management capabilities, but requires access to a Kubernetes cluster.
+The NetApp DataOps Toolkit includes the following client tools:
 
-If you do not have access to a Kubernetes cluster, then you can use the [NetApp DataOps Toolkit for Traditional Environments](netapp_dataops_traditional/). However, this flavor only supports data volume management capabilities. It does not support the JupyterLab workspace and NVIDIA Triton Inference Server management capabilities that are available with the NetApp DataOps Toolkit for Kubernetes.
+- The [NetApp DataOps Toolkit for Kubernetes](netapp_dataops_k8s/) includes data volume management, JupyterLab management, and data movement capabilities for users that have access to a Kubernetes cluster.
+- The [NetApp DataOps Toolkit for Traditional Environments](netapp_dataops_traditional/) includes basic data volume management capabilities. It will run on most Linux and macOS clients, and does not require Kubernetes.
 
 ## Support
 

mcp_servers.md

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
+# MCP Servers Included in the NetApp DataOps Toolkit
+
+The NetApp DataOps Toolkit includes multiple MCP Servers. These MCP Servers expose DataOps Toolkit capabilities as "tools" that can be utilized by AI agents.
+
+## NetApp DataOps Toolkit MCP Server for ONTAP
+
+The [NetApp DataOps Toolkit MCP Server for ONTAP](netapp_dataops_traditional/docs/mcp_server.md) is an MCP Server that enables AI agents to manage volumes and snapshots on an ONTAP system, including NetApp AFF and FAS appliances, Amazon FSx for NetApp ONTAP instances, and NetApp Cloud Volumes ONTAP instances.
+
+### Available Tools
+
+- **Create Volume**: Rapidly provision new data volumes with customizable configurations.
+- **Clone Volume**: Create near-instantaneous, space-efficient clones of existing volumes using NetApp FlexClone technology.
+- **List Volumes**: Retrieve a list of all existing data volumes, with optional space usage details.
+- **Mount Volume**: Mount existing data volumes locally as read-only or read-write.
+- **Create Snapshot**: Create space-efficient, read-only copies of data volumes for versioning and traceability.
+- **List Snapshots**: Retrieve a list of all snapshots for a specific volume.
+- **Create SnapMirror Relationship**: Set up SnapMirror relationships for efficient data replication.
+- **List SnapMirror Relationships**: Retrieve a list of all SnapMirror relationships on the storage system.
+
+## NetApp DataOps Toolkit for Kubernetes MCP Server
+
+The [NetApp DataOps Toolkit for Kubernetes MCP Server](netapp_dataops_k8s/docs/mcp_server_k8s.md) is an MCP Server that enables AI agents to manage persistent volumes, volume snapshots, and JupyterLab workspaces within a Kubernetes cluster. This MCP server relies on the NetApp Trident CSI driver and is compatible with NetApp AFF and FAS appliances, Amazon FSx for NetApp ONTAP, Azure NetApp Files, Google Cloud NetApp Volumes, and NetApp Cloud Volumes ONTAP.
+
+### Available Tools
+
+This MCP Server provides the following tools for managing JupyterLab workspaces and volumes in a Kubernetes environment:
+
+#### Workspace Management Tools
+
+- **CreateJupyterLab**: Create a new JupyterLab workspace.
+- **CloneJupyterLab**: Clone an existing JupyterLab workspace.
+- **ListJupyterLabs**: List all JupyterLab workspaces.
+- **CreateJupyterLabSnapshot**: Create a snapshot of a JupyterLab workspace.
+- **ListJupyterLabSnapshots**: List all snapshots of JupyterLab workspaces.
+
+#### Volume Management Tools
+
+- **CreateVolume**: Create a new volume.
+- **CloneVolume**: Clone an existing volume.
+- **ListVolumes**: List all volumes.
+- **CreateVolumeSnapshot**: Create a snapshot of a volume.
+- **ListVolumeSnapshots**: List all snapshots of volumes.
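As a sketch of how an AI agent would invoke one of the tools listed above, the snippet below constructs the standard MCP `tools/call` JSON-RPC request for the **CreateVolumeSnapshot** tool. The `tools/call` method and message shape come from the MCP specification; the argument names (`pvc_name`, `snapshot_name`) are illustrative assumptions, not the server's documented schema.

```python
import json

# Hypothetical tool-call payload. "tools/call" is the standard MCP method;
# the argument names below are assumptions for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "CreateVolumeSnapshot",     # tool name from the list above
        "arguments": {
            "pvc_name": "project-vol",      # assumed argument name
            "snapshot_name": "baseline-v1", # assumed argument name
        },
    },
}

# The stdio transport sends one JSON object per line to the server process.
message = json.dumps(request)
print(message)
```

An MCP client library normally builds and sends this message for you; the raw form is shown only to make the agent-to-server contract concrete.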

netapp_dataops_k8s/README.md

Lines changed: 10 additions & 5 deletions
@@ -1,7 +1,7 @@
 NetApp DataOps Toolkit for Kubernetes
 =========
 
-The NetApp DataOps Toolkit for Kubernetes is a Python library that makes it simple for developers, data scientists, DevOps engineers, and data engineers to perform various data management tasks within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new persistent volume or data science workspace, the ability to almost instantaneously clone a volume or workspace, the ability to almost instantaneously save off a snapshot of a volume or workspace for traceability/baselining, and the ability to move data between S3 compatible object storage and a Kubernetes persistent volume.
+The NetApp DataOps Toolkit for Kubernetes is a Python library that makes it simple for developers, data scientists, DevOps engineers, and data engineers to perform various data management tasks within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new persistent volume or data science workspace, the ability to almost instantaneously clone a volume or workspace, the ability to almost instantaneously save off a snapshot of a volume or workspace for traceability/baselining, and the ability to move data between S3 compatible object storage and a Kubernetes persistent volume. The toolkit also includes an [MCP Server](docs/mcp_server_k8s.md) that exposes many of the capabilities as "tools" that can be utilized by AI agents.
 
 ## Compatibility
 
@@ -15,16 +15,17 @@ The toolkit is currently compatible with Trident versions 20.07 and above. Addit
 
 - ontap-nas
 - ontap-nas-flexgroup
-- gcp-cvs
 - azure-netapp-files
+- google-cloud-netapp-volume
+- gcp-cvs
 
 The toolkit is currently compatible with all versions of the BeeGFS CSI driver, though not all functionality is supported by BeeGFS. Operations that are not supported by BeeGFS are noted within the documentation.
 
 ## Installation
 
 ### Prerequisites
 
-The NetApp DataOps Toolkit for Kubernetes requires that Python 3.8, 3.9, 3.10, or 3.11 be installed on the local host. Additionally, the toolkit requires that pip for Python3 be installed on the local host. For more details regarding pip, including installation instructions, refer to the [pip documentation](https://pip.pypa.io/en/stable/installing/).
+The NetApp DataOps Toolkit for Kubernetes requires that Python 3.8, 3.9, 3.10, 3.11, 3.12, or 3.13 be installed on the local host. Additionally, the toolkit requires that pip for Python3 be installed on the local host. For more details regarding pip, including installation instructions, refer to the [pip documentation](https://pip.pypa.io/en/stable/installing/).
 
@@ -38,9 +39,9 @@ python3 -m pip install netapp-dataops-k8s
 
 ## Getting Started: Standard Usage
 
-The NetApp DataOps Toolkit for Kubernetes can be utilized from any Linux or macOS host that has network access to the Kubernetes cluster.
+The NetApp DataOps Toolkit for Kubernetes can be utilized from any Linux or macOS client that has network access to the Kubernetes cluster.
 
-The toolkit requires that a valid kubeconfig file be present on the local host, located at `$HOME/.kube/config` or at another path specified by the `KUBECONFIG` environment variable. Refer to the [Kubernetes documentation](https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/) for more information regarding kubeconfig files.
+The toolkit requires that a valid kubeconfig file be present on the client, located at `$HOME/.kube/config` or at another path specified by the `KUBECONFIG` environment variable. Refer to the [Kubernetes documentation](https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/) for more information regarding kubeconfig files.
 
 ## Getting Started: In-cluster Usage (for advanced Kubernetes users)
 
@@ -71,6 +72,10 @@ Refer to the [Kubernetes documentation](https://kubernetes.io/docs/tasks/run-app
 
 The NetApp DataOps Toolkit for Kubernetes provides the following capabilities.
 
+### MCP Server
+
+The NetApp DataOps Toolkit for Kubernetes includes an [MCP Server](docs/mcp_server_k8s.md) that exposes many of the [Workspace Management](docs/workspace_management.md) and [Volume Management](docs/volume_management.md) capabilities as tools that can be utilized by AI agents.
+
 ### Workspace Management
 
 The NetApp DataOps Toolkit can be used to manage data science workspaces within a Kubernetes cluster. Some of the key capabilities that the toolkit provides are the ability to provision a new JupyterLab workspace, the ability to almost instantaneously clone a JupyterLab workspace, and the ability to almost instantaneously save off a snapshot of a JupyterLab workspace for traceability/baselining.
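The kubeconfig requirement described under "Getting Started: Standard Usage" above follows the standard Kubernetes client lookup order. As a minimal illustrative sketch (the `resolve_kubeconfig` helper below is not part of the toolkit), the resolution works like this:

```python
import os

def resolve_kubeconfig() -> str:
    # Standard kubeconfig lookup order used by Kubernetes clients:
    # the KUBECONFIG environment variable takes precedence; otherwise
    # fall back to the default location, $HOME/.kube/config.
    return os.environ.get("KUBECONFIG") or os.path.join(
        os.path.expanduser("~"), ".kube", "config"
    )

print(resolve_kubeconfig())
```

If the toolkit reports that it cannot find a valid kubeconfig, checking which of these two paths it is actually resolving to is usually the first debugging step.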
netapp_dataops_k8s/docs/mcp_server_k8s.md

Lines changed: 80 additions & 0 deletions
@@ -0,0 +1,80 @@
+# NetApp DataOps Toolkit for Kubernetes MCP Server
+
+## Description
+
+The NetApp DataOps Toolkit MCP Server is an open-source, Python-based server component that makes your Kubernetes DataOps environments accessible via the Model Context Protocol (MCP). The MCP server enables standardized, interactive, and programmatic connectivity between DataOps resources and modern ML/data platforms that support the MCP standard.
+
+>[!NOTE]
+>This MCP server uses the stdio transport, as shown in the [MCP Server Quickstart](https://modelcontextprotocol.io/quickstart/server), making it a "local MCP server".
+
+## Available Tools
+
+The MCP server provides the following tools for managing JupyterLab workspaces and volumes in a Kubernetes environment:
+
+### Workspace Management Tools
+
+- **CreateJupyterLab**: Create a new JupyterLab workspace.
+- **CloneJupyterLab**: Clone an existing JupyterLab workspace.
+- **ListJupyterLabs**: List all JupyterLab workspaces.
+- **CreateJupyterLabSnapshot**: Create a snapshot of a JupyterLab workspace.
+- **ListJupyterLabSnapshots**: List all snapshots of JupyterLab workspaces.
+
+### Volume Management Tools
+
+- **CreateVolume**: Create a new volume.
+- **CloneVolume**: Clone an existing volume.
+- **ListVolumes**: List all volumes.
+- **CreateVolumeSnapshot**: Create a snapshot of a volume.
+- **ListVolumeSnapshots**: List all snapshots of volumes.
+
+## Quick Start
+
+### Prerequisites
+
+- Python >= 3.10
+- [uv](https://docs.astral.sh/uv/) or [pip](https://pypi.org/project/pip/)
+- Access to a Kubernetes environment with NetApp Trident installed and configured. Refer to [the main README](../README.md) for full compatibility details.
+
+To run the MCP tools from the MCP server, a valid kubeconfig file must be present on the local host. Refer to [the "Getting Started: Standard Usage" section of the main README](../README.md#getting-started) of the NetApp DataOps Toolkit for Kubernetes to learn more.
+
+### Usage Instructions
+
+#### Run with uv (recommended)
+
+To run the MCP server using uv, run the following command. You do not need to install the NetApp DataOps Toolkit package before running this command.
+
+```sh
+uvx --from netapp-dataops-k8s netapp_dataops_k8s_mcp.py
+```
+
+#### Install with pip and run from PATH
+
+To install the NetApp DataOps Toolkit for Kubernetes, run the following command.
+
+```sh
+python3 -m pip install netapp-dataops-k8s
+```
+
+After installation, the `netapp_dataops_k8s_mcp.py` command will be available in your PATH for direct usage.
+
+#### Usage
+
+##### Example JSON Config
+
+To use the MCP server with an MCP client, you need to configure the client to use the server. For many clients (such as [VS Code](https://code.visualstudio.com/docs/copilot/chat/mcp-servers), [Claude Desktop](https://modelcontextprotocol.io/quickstart/user), and [AnythingLLM](https://docs.anythingllm.com/mcp-compatibility/overview)), this requires editing a config file that is in JSON format. Below is an example. Refer to the documentation for your MCP client for specific formatting details.
+
+```json
+{
+  "mcpServers": {
+    "netapp_dataops_k8s_mcp": {
+      "type": "stdio",
+      "command": "uvx",
+      "args": [
+        "--from",
+        "netapp-dataops-k8s",
+        "netapp_dataops_k8s_mcp.py"
+      ]
+    }
+  }
+}
+```
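A quick way to check a config like the one above before handing it to an MCP client is to parse it with Python's standard-library `json` module and reconstruct the launch command the client will run. This is purely an illustrative sanity check, not part of the toolkit:

```python
import json

# Same structure as the example config above.
config_text = """
{
  "mcpServers": {
    "netapp_dataops_k8s_mcp": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--from", "netapp-dataops-k8s", "netapp_dataops_k8s_mcp.py"]
    }
  }
}
"""

server = json.loads(config_text)["mcpServers"]["netapp_dataops_k8s_mcp"]

# For a stdio server, the client spawns this process and speaks MCP over
# its stdin/stdout.
command_line = " ".join([server["command"], *server["args"]])
print(command_line)  # → uvx --from netapp-dataops-k8s netapp_dataops_k8s_mcp.py
```

A stray comma or mismatched brace is the most common reason a client silently fails to load a stdio server, so validating the JSON first saves a debugging round-trip.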

netapp_dataops_k8s/netapp_dataops/k8s/__init__.py

Lines changed: 5 additions & 1 deletion
@@ -1148,8 +1148,10 @@ def create_jupyter_lab_snapshot(workspace_name: str, snapshot_name: str = None,
     if print_output:
         print(
             "Creating VolumeSnapshot for JupyterLab workspace '" + workspace_name + "' in namespace '" + namespace + "'...")
-    create_volume_snapshot(pvc_name=_get_jupyter_lab_workspace_pvc_name(workspaceName=workspace_name), snapshot_name=snapshot_name,
+    snapshot_name = create_volume_snapshot(pvc_name=_get_jupyter_lab_workspace_pvc_name(workspaceName=workspace_name), snapshot_name=snapshot_name,
                            volume_snapshot_class=volume_snapshot_class, namespace=namespace, print_output=print_output)
+
+    return snapshot_name
 
 
 def create_k8s_config_map(name: str, data: dict, namespace: str = 'default', labels: dict = None,
@@ -1362,6 +1364,8 @@ def create_volume_snapshot(pvc_name: str, snapshot_name: str = None, volume_snap
     if print_output:
         print("Snapshot successfully created.")
 
+    return snapshot_name
+
 
 def delete_jupyter_lab(workspace_name: str, namespace: str = "default", preserve_snapshots: bool = False,
                        print_output: bool = False):
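The change above makes `create_volume_snapshot` return the snapshot name it actually used, which matters because the name may be auto-generated when the caller passes `None`. The simplified stand-in below illustrates the pattern; it is not the toolkit's implementation, and the `ntap-dsutil.<timestamp>` default-name format is an assumption for illustration:

```python
from datetime import datetime

def create_volume_snapshot_stub(pvc_name: str, snapshot_name: str = None) -> str:
    # Simplified stand-in for the toolkit function: when no name is supplied,
    # a timestamped default is generated, so returning the name is the only
    # way a caller (e.g. create_jupyter_lab_snapshot, as patched above) can
    # learn which snapshot name was actually used.
    if not snapshot_name:
        snapshot_name = "ntap-dsutil." + datetime.today().strftime("%Y%m%d%H%M%S")
    return snapshot_name

print(create_volume_snapshot_stub("project-vol"))              # auto-generated name
print(create_volume_snapshot_stub("project-vol", "baseline"))  # caller-supplied name
```

Without the `return`, wrappers like `create_jupyter_lab_snapshot` would create a snapshot but have no way to report its generated name back to their own callers, which is exactly what this commit fixes.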
