From ae32df1f834a50e34ddd0fae9ea9ea2f4b092eb4 Mon Sep 17 00:00:00 2001
From: Mergen Nachin
Date: Sat, 9 Aug 2025 10:08:21 +0100
Subject: [PATCH] Add documentation for the missing pytorch_tokenizers dependency

---
 docs/source/llm/export-llm.md        | 10 ++++++++++
 docs/source/using-executorch-faqs.md |  7 +++++++
 2 files changed, 17 insertions(+)

diff --git a/docs/source/llm/export-llm.md b/docs/source/llm/export-llm.md
index 35f17a8aa72..462d9a51849 100644
--- a/docs/source/llm/export-llm.md
+++ b/docs/source/llm/export-llm.md
@@ -2,6 +2,16 @@
 
 Instead of needing to manually write code to call torch.export(), use ExecuTorch's assortment of lowering APIs, or even interact with TorchAO quantize_ APIs for quantization, we have provided an out of box experience which performantly exports a selection of supported models to ExecuTorch.
 
+## Prerequisites
+
+The LLM export functionality requires the `pytorch_tokenizers` package. If you encounter a `ModuleNotFoundError: No module named 'pytorch_tokenizers'` error, install the package in editable mode from the root of the ExecuTorch repository:
+
+```bash
+pip install -e ./extension/llm/tokenizers/
+```
+
+## Supported Models
+
 As of this doc, the list of supported LLMs include the following:
 - Llama 2/3/3.1/3.2
 - Qwen 2.5/3
diff --git a/docs/source/using-executorch-faqs.md b/docs/source/using-executorch-faqs.md
index f639524d69c..d1bd0390569 100644
--- a/docs/source/using-executorch-faqs.md
+++ b/docs/source/using-executorch-faqs.md
@@ -14,6 +14,13 @@ sudo apt install python-dev
 ```
 if you are using Ubuntu, or use an equivalent install command.
 
+### ModuleNotFoundError: No module named 'pytorch_tokenizers'
+
+The `pytorch_tokenizers` package is required for LLM export functionality. Install it in editable mode from the root of the ExecuTorch repository:
+```bash
+pip install -e ./extension/llm/tokenizers/
+```
+
 ## Export
 
 ### Missing out variants: { _ }
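
As a quick sanity check after following the install step documented above, you can confirm the package is visible to your environment. This is a minimal sketch, not part of the patch itself; it assumes the `pip install -e` command was run from the root of an ExecuTorch checkout using the same Python environment you use for LLM export:

```bash
# Confirm the editable install is registered with pip and importable by Python.
pip show pytorch_tokenizers            # prints package metadata if the install succeeded
python -c "import pytorch_tokenizers"  # exits silently on success; raises ModuleNotFoundError otherwise
```

If the import succeeds, re-running the export command that previously failed should no longer hit the `ModuleNotFoundError` described in the FAQ entry.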