Commit 0e547c1
Remove default bos/eos from metadata (pytorch#15231)
Summary:
See: pytorch#15215
Currently:
- default eos/bos tokens are embedded into the pte
- llama3 instruct has a different set of eos/bos tokens
- users must manually specify the llama3 instruct eos/bos tokens at export time, because the runner overrides the tokenizer's eos/bos with the values stored in the PTE
This diff:
- removes the default eos/bos metadata
- relies on the tokenizer for eos/bos UNLESS the user explicitly specifies them in the metadata, in which case the runner uses the eos/bos saved in the PTE (see the sketch below)
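A minimal sketch of the intended precedence, assuming a hypothetical helper name (`resolve_special_tokens`), illustrative metadata keys (`get_bos_id` / `get_eos_id`), and made-up token ids; this is not the actual runner or export code:

```python
# Sketch: eos/bos come from the tokenizer unless the user explicitly
# placed them in the PTE metadata at export time.
from dataclasses import dataclass


@dataclass
class Tokenizer:
    bos_id: int
    eos_id: int


def resolve_special_tokens(
    tokenizer: Tokenizer,
    pte_metadata: dict[str, int],
) -> tuple[int, int]:
    """Prefer explicitly exported PTE metadata; otherwise fall back to the tokenizer."""
    bos = pte_metadata.get("get_bos_id", tokenizer.bos_id)
    eos = pte_metadata.get("get_eos_id", tokenizer.eos_id)
    return bos, eos


if __name__ == "__main__":
    # Illustrative ids only, not the real llama3 instruct values.
    tok = Tokenizer(bos_id=128000, eos_id=128009)

    # No defaults baked into the PTE anymore: the tokenizer's values win.
    print(resolve_special_tokens(tok, {}))

    # User explicitly exported eos metadata: the PTE value wins.
    print(resolve_special_tokens(tok, {"get_eos_id": 2}))
```

With this precedence, exporting llama3 instruct no longer requires passing its eos/bos ids manually; the tokenizer supplies them at runtime, and explicit metadata still takes priority when the user opts in.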
Reviewed By: jackzhxng
Differential Revision: D849427181
1 file changed: +0 −11