Commit d4c9f8b
Update base for Update on "Refactor attention v2"
Pull attention creation out of Transformer/TransformerBlock. Instead, pass the layers into Transformer.
The motivation is to customize the linear layers in attention for LoRA (e.g., make wq a LoraLinear instead of a regular Linear). In the next diff (D73517350), we pull wq, wk, wv, and wo out of the attention and pass those in as well.
This allows us to customize attention parameters without passing in ModelArgs and doing the customization deep inside attention.py.
I think this modularizes our attention/transformer components, though it also means users have to do some more work to construct the attention layers and pass them to the Transformer (see the sketch below).
It follows the torchtune structure more closely, e.g., https://github.com/pytorch/torchtune/blob/main/torchtune/models/llama3_2/_component_builders.py#L221
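
To illustrate the construction pattern this stack moves toward (this diff passes the layers into Transformer; D73517350 then injects wq/wk/wv/wo into attention), here is a minimal sketch. The class names and signatures below (LoraLinear, SimpleAttention, Transformer) are illustrative placeholders, not the exact ExecuTorch API:

```python
# Minimal sketch of the construction pattern described above. Class names are
# illustrative placeholders, not the exact ExecuTorch API.
import torch
import torch.nn as nn


class LoraLinear(nn.Module):
    """A linear layer with a low-rank (LoRA) update added to its output."""

    def __init__(self, in_dim: int, out_dim: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim, bias=False)
        self.lora_a = nn.Linear(in_dim, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_dim, bias=False)
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


class SimpleAttention(nn.Module):
    """Attention whose projections are injected, so the caller decides their type."""

    def __init__(self, wq: nn.Module, wk: nn.Module, wv: nn.Module, wo: nn.Module):
        super().__init__()
        self.wq, self.wk, self.wv, self.wo = wq, wk, wv, wo


class Transformer(nn.Module):
    """Receives fully constructed layers instead of building them from ModelArgs."""

    def __init__(self, layers: nn.ModuleList):
        super().__init__()
        self.layers = layers


dim, n_layers = 512, 4
layers = nn.ModuleList(
    SimpleAttention(
        wq=LoraLinear(dim, dim),          # customized projection (LoRA)
        wk=nn.Linear(dim, dim, bias=False),
        wv=nn.Linear(dim, dim, bias=False),
        wo=nn.Linear(dim, dim, bias=False),
    )
    for _ in range(n_layers)
)
model = Transformer(layers)
```

The point is that the caller, not attention.py, decides whether a projection is a plain nn.Linear or a LoRA-wrapped one, so no ModelArgs plumbing is needed deep inside the attention module.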
Differential Revision: [D73538697](https://our.internmc.facebook.com/intern/diff/D73538697/)
[ghstack-poisoned]

File tree (66 files changed: +3757 −445 lines)

- backends
  - apple/mps
  - arm
    - operator_support
    - operators
  - cadence
    - aot
    - hifi/operators
  - vulkan
    - runtime/graph/ops
      - glsl
      - impl
    - test/op_tests
  - xnnpack
    - operators
    - quantizer
    - test/quantizer
- docs/source
- examples
  - apple/mps
    - executor_runner
    - scripts
  - demo-apps
    - android/LlamaDemo/app/src/main/java/com/example/executorchllamademo
    - apple_ios/LLaMA
      - LLaMA.xcodeproj
      - LLaMA/Application
  - models/llama
- exir
  - passes
  - tests
- extension
  - android/jni
  - apple/ExecuTorch
    - Exported
    - __tests__
  - benchmark/apple/Benchmark
    - Benchmark.xcodeproj
  - flat_tensor
- kernels
  - optimized/cpu
  - portable/cpu
  - test