Commit 9fc8c47 ("amend")

Parent: b5a2eb5

1 file changed: +1 addition, −1 deletion

docs/install_maxtext.md

Lines changed: 1 addition & 1 deletion
@@ -159,7 +159,7 @@ install_maxtext_github_deps
 
 ## Appendix: Install XPK for Multi-host Workloads
 
-**Note:** XPK is only required for multi-host TPU configurations (e.g., v5p-128, v6e-256). For single-host training, XPK is not needed and you can run MaxText directly on your TPU VM.
+> **_NOTE:_** XPK is only required for multi-host TPU configurations (e.g., v5p-128, v6e-256). For single-host training, XPK is not needed and you can run MaxText directly on your TPU VM.
 
 XPK (Accelerated Processing Kit) is a tool designed to simplify the orchestration and management of workloads on Google Kubernetes Engine (GKE) clusters with TPU or GPU accelerators. In MaxText, we use XPK to submit both pre-training and post-training jobs on multi-host TPU configurations.
 
