Commit b5a2eb5

clarification
1 parent ea0d0a6 commit b5a2eb5

1 file changed: 2 additions, 0 deletions


docs/install_maxtext.md

Lines changed: 2 additions & 0 deletions
@@ -159,6 +159,8 @@ install_maxtext_github_deps
 
 ## Appendix: Install XPK for Multi-host Workloads
 
+**Note:** XPK is only required for multi-host TPU configurations (e.g., v5p-128, v6e-256). For single-host training, XPK is not needed and you can run MaxText directly on your TPU VM.
+
 XPK (Accelerated Processing Kit) is a tool designed to simplify the orchestration and management of workloads on Google Kubernetes Engine (GKE) clusters with TPU or GPU accelerators. In MaxText, we use XPK to submit both pre-training and post-training jobs on multi-host TPU configurations.
 
 For your convenience, we provide a minimal installation path below:
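The added note turns on a single-host vs. multi-host distinction. A minimal shell sketch of that decision logic follows; the topology names and the case branches are illustrative assumptions, not taken from the commit:

```shell
# Illustrative only: the topology names below are assumptions, not from the commit.
# Single-host slices run MaxText directly on the TPU VM; multi-host slices
# (e.g., v5p-128, v6e-256) are submitted through XPK on a GKE cluster.
topology="v5p-128"
case "$topology" in
  v5p-8|v6e-8)
    echo "single-host: run MaxText directly on the TPU VM (no XPK needed)" ;;
  *)
    echo "multi-host: submit the job through XPK on a GKE cluster" ;;
esac
```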
