Llama 3.2 rope scale factor is hardcoded to 8 #7265

@mergennachin

Description

🐛 Describe the bug

Between Llama 3.1 and Llama 3.2, the RoPE scale factor increased from 8 to 32.

https://huggingface.co/.../Llama-3.2-3B.../discussions/1
https://huggingface.co/.../f6dcb8adea08358576f9a68d351035...

But our code still hard-codes the value 8 at https://github.com/pytorch/executorch/blob/main/examples/models/llama/rope.py#L20
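One way to avoid the hardcoding would be to take the scale factor as a parameter of the frequency-rescaling step. The sketch below is an assumption-laden illustration of the Llama 3.1-style RoPE scaling scheme (piecewise scaling with a smooth transition band), not the exact executorch implementation; the default values for `low_freq_factor`, `high_freq_factor`, and `old_context_len` are taken from the publicly documented Llama 3.1 configs and may differ from what the repo uses.

```python
import math

def apply_scaling(
    freqs: list[float],
    scale_factor: float = 8.0,    # 8 for Llama 3.1; 32 for Llama 3.2 (per this issue)
    low_freq_factor: float = 1.0,
    high_freq_factor: float = 4.0,
    old_context_len: int = 8192,  # original Llama 3 context length
) -> list[float]:
    """Rescale RoPE frequencies for long-context inference (Llama 3.1+ style).

    Low frequencies (wavelength longer than the old context) are divided by
    scale_factor, high frequencies are kept as-is, and the band in between
    is linearly interpolated between the two.
    """
    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor
    out = []
    for freq in freqs:
        wavelen = 2 * math.pi / freq
        if wavelen < high_freq_wavelen:
            out.append(freq)                 # high frequency: unchanged
        elif wavelen > low_freq_wavelen:
            out.append(freq / scale_factor)  # low frequency: scaled down
        else:
            # transition band: interpolate between scaled and unscaled
            smooth = (old_context_len / wavelen - low_freq_factor) / (
                high_freq_factor - low_freq_factor
            )
            out.append((1 - smooth) * freq / scale_factor + smooth * freq)
    return out
```

With `scale_factor` exposed like this, callers could pass 8 for Llama 3.1 checkpoints and 32 for Llama 3.2 (ideally read from the checkpoint's `rope_scaling` config) instead of relying on a hardcoded constant.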

Versions

N/A

Metadata

Assignees

Labels

actionable: Items in the backlog waiting for an appropriate impl/fix
module: llm: Issues related to LLM examples and apps, and to the extensions/llm/ code
triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
