
Commit 4eb5571

Fix command for calling export_llm API (#15329)
1 parent 37a65b5 commit 4eb5571

File tree

1 file changed: +1, −1 lines changed


docs/source/llm/export-llm.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -26,7 +26,7 @@ The up-to-date list of supported LLMs can be found in the code [here](https://gi
 `export_llm` is ExecuTorch's high-level export API for LLMs. In this tutorial, we will focus on exporting Llama 3.2 1B using this API. `export_llm`'s arguments are specified either through CLI args or through a yaml configuration whose fields are defined in [`LlmConfig`](https://github.com/pytorch/executorch/blob/main/extension/llm/export/config/llm_config.py). To call `export_llm`:

 ```
-python -m executorch.examples.extension.llm.export.export_llm
+python -m executorch.extension.llm.export.export_llm
   --config <path-to-config-yaml>
   +base.<additional-CLI-overrides>
 ```
````
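For reference, the yaml file passed via `--config` might look like the sketch below. This is an illustrative assumption, not content from this commit: the grouped layout (`base`, `model`, `export`) and field names are guesses at the `LlmConfig` schema linked in the diff, and every value is a placeholder — check `extension/llm/export/config/llm_config.py` for the authoritative fields.

```yaml
# Hypothetical export_llm config sketch; field names and grouping are
# assumptions about LlmConfig -- verify against
# extension/llm/export/config/llm_config.py before relying on them.
base:
  model_class: llama3_2            # which supported LLM to export (assumed field)
  checkpoint: /path/to/model.pth   # placeholder path
  params: /path/to/params.json     # placeholder path
model:
  use_kv_cache: true               # assumed option name
export:
  output_name: llama3_2_1b.pte     # assumed option name
```

As the `+base.<additional-CLI-overrides>` line in the diff suggests, individual fields can also be overridden on the command line with dotted keys rather than edited in the yaml file.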
