Commit a726c0b

refactor

1 parent 97b02c6 commit a726c0b

1 file changed: +3 -3 lines changed


program-data-separation/README.md (3 additions, 3 deletions)
```diff
@@ -1,7 +1,7 @@
 # Program Data Separation Examples
 
 This directory provides an example of the Program Data Separation APIs in ExecuTorch. Specifically, it showcases:
-1. Simple program data separation examples using the portable operators and XNNPACK.
+1. Program data separation examples using a linear model with the portable operators and XNNPACK.
 2. LoRA inference example with a LoRA and non-LoRA model sharing foundation weights.
 
 ## Program Data Separation
@@ -15,10 +15,10 @@ PTD files are used to store data outside of the PTE file. Some use-cases:
 
 For more information on the PTD data format, please see the [flat_tensor](https://github.com/pytorch/executorch/blob/main/extension/flat_tensor/README.md) directory.
 
-## Export a model with program-data separation
+## Linear example
 For a demo of the program-data separation APIs using a linear model, please see [program-data-separation/cpp/linear_example](linear_example/). This example generates and runs a program-data separated linear model, with weights and bias in a separate .ptd file.
 
-## Export a model with LoRA
+## Linear example
-## Export a model with LoRA
+## LoRA example
 A major use-case that program-data separation enables is inference with multiple LoRA adapters. LoRA is a fine-tuning technique introduced in [LoRA: Low-Rank Adaptation of Large Language Models](https://arxiv.org/abs/2106.09685). LoRA fine-tuning produces lightweight 'adapter' weights that can be applied to an existing model to adapt it to a new task. LoRA adapters are typically small in comparison to LLM foundation weights, on the order of KB-MB depending on the finetuning setup and model size.
 
 To enable LoRA, we generate:
```
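The weight-sharing idea the diff describes (one set of foundation weights reused by a base model and a LoRA-adapted one) can be sketched in plain Python. This is a conceptual illustration only, assuming a toy 2x2 model; the function names (`base_forward`, `lora_forward`, `matvec`) are made up here and are not ExecuTorch or PyTorch APIs.

```python
# Conceptual sketch of weight sharing behind program-data separation:
# one set of foundation weights W (think: the shared .ptd data), used by
# both a base "program" and a LoRA-adapted one. Illustrative names only.

def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def vadd(a, b):
    """Elementwise vector addition."""
    return [a_i + b_i for a_i, b_i in zip(a, b)]

# Shared foundation weights, stored once (2x2 identity for the sketch).
W = [[1.0, 0.0], [0.0, 1.0]]

# LoRA adapter: low-rank factors B (2x1) and A (1x2), tiny next to W.
B = [[0.5], [0.5]]
A = [[1.0, 1.0]]

def base_forward(x):
    # y = W x : the non-LoRA model uses the foundation weights directly.
    return matvec(W, x)

def lora_forward(x):
    # y = W x + B (A x) : adapter applied on top of the same shared W.
    return vadd(matvec(W, x), matvec(B, matvec(A, x)))

x = [1.0, 2.0]
print(base_forward(x))  # [1.0, 2.0]
print(lora_forward(x))  # [2.5, 3.5]
```

Because only `A` and `B` differ per task, many adapted models can share one copy of `W`, which is exactly why keeping the foundation weights in a separate data file pays off.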
