This guide shows you how to use AIKit to package AI models as OCI artifacts using the ModelPack specification.
## What is AIKit?

[AIKit](https://kaito-project.github.io/aikit/docs/) is a comprehensive platform for quickly getting started with hosting, deploying, building, and fine-tuning large language models (LLMs). AIKit also supports packaging models as OCI artifacts for distribution through any OCI-compliant registry.

## Prerequisites

- Docker with [BuildKit](https://docs.docker.com/build/buildkit/) support
- [ORAS](https://oras.land/docs/installation) or [Skopeo](https://github.com/containers/skopeo/blob/main/install.md) for pushing to registries

## Package a Model
AIKit uses Docker BuildKit to package models from various sources (local files, HTTP/HTTPS, or Hugging Face).
For more packaging options including compression modes, layer categorization, and exclusions, see the [AIKit packaging documentation](https://kaito-project.github.io/aikit/docs/packaging).
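
As a minimal sketch, assuming a Dockerfile written per the AIKit packaging documentation (the image tag and output directory below are placeholder names, not values from this guide), a BuildKit build that emits an OCI image layout looks like:

```shell
# Build with BuildKit and export the result as an OCI image layout
# directory instead of loading it into the local image store.
# "myregistry.com/my-model" and "./model-layout" are placeholder names.
docker buildx build . \
  -t myregistry.com/my-model:latest \
  --output type=oci,tar=false,dest=./model-layout
```

The `type=oci,tar=false` output writes an OCI image layout directory rather than a tarball, which is the form the next section pushes to a registry.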
## Push to a Registry
Use ORAS or Skopeo to push the OCI layout to a remote registry:
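
For example, assuming the layout was written to `./model-layout` with tag `latest` (both placeholder names), either tool can copy it to a registry:

```shell
# With ORAS: copy from a local OCI image layout to a remote registry
oras cp --from-oci-layout ./model-layout:latest myregistry.com/my-model:latest

# Or with Skopeo, using the "oci:" transport to read the layout directory
skopeo copy oci:./model-layout:latest docker://myregistry.com/my-model:latest
```

Skopeo's `docker://` transport works against any registry that speaks the OCI distribution API.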
- Basic understanding of containers and OCI concepts
- Access to an OCI-compatible registry (Docker Hub, Harbor, etc.)
### Choose Your Tool
The ModelPack specification can be used with different tools depending on your needs:
- **[modctl](./modctl.md)**: CLI tool for building, pushing, pulling, and managing OCI model artifacts. Great for command-line workflows and CI/CD pipelines.
- **[AIKit](./aikit.md)**: Package AI models as OCI artifacts from local, HTTP, or Hugging Face sources with extensible formats.
- **[KitOps](https://kitops.ml/)**: ModelKit packaging and deployment platform that supports the ModelPack specification.
### Install Model CSI Driver
If you plan to use models in Kubernetes, install the Model CSI Driver by following the instructions in the [Model CSI Driver repository](https://github.com/modelpack/model-csi-driver/blob/main/docs/getting-started.md#helm-installation).
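
As a rough sketch only, a Helm installation typically looks like the following; the chart location, release name, and namespace below are placeholders, and the actual values come from the Model CSI Driver instructions linked above:

```shell
# Placeholder chart reference: substitute the location documented in the
# Model CSI Driver repository's Helm installation instructions.
CSI_CHART=oci://myregistry.com/charts/model-csi-driver
helm install model-csi-driver "$CSI_CHART" --namespace kube-system
```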
Here's an example Kubernetes pod spec that mounts a model artifact using the model CSI driver. The model will be available under the `/model` directory inside the container.
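
The pod spec itself is not reproduced here, so the following is an illustrative sketch only: the CSI `driver` name and the `volumeAttributes` key are placeholders (consult the Model CSI Driver documentation for the actual values), and only the `/model` mount path is taken from the text above.

```yaml
# Illustrative pod spec; "model.csi.example.com" and the "image"
# volumeAttribute are placeholders, not the driver's documented values.
apiVersion: v1
kind: Pod
metadata:
  name: model-demo
spec:
  containers:
    - name: app
      image: busybox
      command: ["sleep", "infinity"]
      volumeMounts:
        - name: model
          mountPath: /model   # model files are exposed here
  volumes:
    - name: model
      csi:
        driver: model.csi.example.com        # placeholder driver name
        volumeAttributes:
          image: myregistry.com/my-model:v1  # placeholder model reference
```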
## Next Steps
1. **Get hands-on experience**: Follow the step-by-step guides for [modctl](./modctl.md) or [AIKit](./aikit.md)
2. **Explore the [full ModelPack specification](./spec.md)** for technical implementation details
3. **Join the community** on [CNCF Slack #modelpack](https://cloud-native.slack.com/archives/C07T0V480LF)
4. **Contribute** to the ModelPack project - see our [contributing guidelines](../CONTRIBUTING.md)
This guide shows you how to use `modctl` to package, distribute, and manage AI models using the ModelPack specification.
## Installation

Follow the instructions in the [modctl GitHub repository](https://github.com/modelpack/modctl/blob/main/docs/getting-started.md#installation) to install the `modctl` CLI.

## Download A Model

To package a model, first download it to a local directory. The following example shows how to download a model from Hugging Face.

```bash
export HF_MODEL="Qwen/Qwen3-0.6B"
export MODEL_PATH=my-model-directory

# Install the Hugging Face CLI
pip install 'huggingface_hub'

# Log in to the Hugging Face CLI
hf auth login --token <your-huggingface-token>

# Download a model
hf download $HF_MODEL --local-dir $MODEL_PATH
```
## Package Your First Model

The following script walks through building a ModelPack-format model artifact and pushing it to a model registry.

```bash
# Please modify the MODEL_REGISTRY environment variable to point to your OCI model registry
export MODEL_REGISTRY=myregistry.com

# If $MODEL_REGISTRY needs authentication, please log in first
```