Commit 2c87145
Pin transformers version to 4.53.1 to avoid breakage
To avoid this error, where the transformers ExecuTorch integration calls `cache_config.get("batch_size")` on a `StaticCacheConfig` object that has no `get` method:
```
Traceback (most recent call last):
  File "/opt/conda/envs/py_3.10/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/conda/envs/py_3.10/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 168, in <module>
    main()
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 164, in main
    export(parser.parse_args())
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/executorch/examples/models/phi-3-mini/export_phi-3-mini.py", line 79, in export
    exportable_module = TorchExportableModuleForDecoderOnlyLM(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/integrations/executorch.py", line 67, in __init__
    self.model = TorchExportableModuleWithStaticCache(model)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/integrations/executorch.py", line 293, in __init__
    max_batch_size=self.model.generation_config.cache_config.get("batch_size"),
AttributeError: 'StaticCacheConfig' object has no attribute 'get'
```
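The failing call treats `cache_config` as a dict, but the `StaticCacheConfig` object has no `get` method. A minimal hypothetical reproduction of that pattern (the `ConfigStandIn` class below is an illustration only, not the real `StaticCacheConfig`):

```
# Hypothetical stand-in for a config object that exposes fields as attributes,
# not dict keys. Not the real transformers StaticCacheConfig.
from dataclasses import dataclass

@dataclass
class ConfigStandIn:
    batch_size: int = 1

cfg = ConfigStandIn()
print(cfg.batch_size)  # attribute access works: 1
cfg.get("batch_size")  # AttributeError: 'ConfigStandIn' object has no attribute 'get'
```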
1 file changed: +1 −1 (one line replaced at line 7 of the changed file).
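The fix itself is the one-line version pin. As a minimal sketch of enforcing the same pin before running the export (assuming a pip-managed environment with the `packaging` library available; the check below is illustrative and not part of this commit):

```
# Illustrative guard: fail fast if the pinned transformers version is not installed.
# The 4.53.1 value comes from this commit's message; everything else is an assumption.
import transformers
from packaging import version

PINNED = "4.53.1"

if version.parse(transformers.__version__) != version.parse(PINNED):
    raise RuntimeError(
        f"transformers=={PINNED} is required for export_phi-3-mini.py; "
        f"found {transformers.__version__}, which may hit the "
        "'StaticCacheConfig' object has no attribute 'get' error above."
    )
```

In practice the same effect comes from installing the pinned release directly, e.g. `pip install transformers==4.53.1`.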