## Purpose ##
* Support R4 transforms before R3. R3 requires hooking into the attention module, whereas R4 does not (see the sketch below)
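
For context, a minimal sketch of why R4 is the easier case. The names and hook mechanism here are illustrative assumptions, not this PR's actual API: assuming R4 rotates the activations entering a `Linear` layer such as `mlp.down_proj` (with the inverse rotation fused into that layer's weights), it can be attached as an ordinary module-level pre-hook, whereas R3 rotates the query/key states inside the attention forward (after RoPE) and therefore requires modifying the attention module itself.

```python
# Hypothetical sketch: applying an R4-style rotation with a forward pre-hook
# on the MLP down projection. No attention code is touched, which is why R4
# can land ahead of R3.
import torch

def make_r4_pre_hook(rotation: torch.Tensor):
    """Rotate the activations entering a Linear layer (e.g. mlp.down_proj)."""
    def pre_hook(module, args):
        (hidden_states,) = args
        return (hidden_states @ rotation,)
    return pre_hook

hidden = 16
# Placeholder rotation; a real R4 would use a (scaled) Hadamard or random
# orthogonal matrix, with the inverse folded into down_proj.weight so the
# layer's overall function is unchanged.
rotation = torch.eye(hidden)
down_proj = torch.nn.Linear(hidden, hidden, bias=False)
down_proj.register_forward_pre_hook(make_r4_pre_hook(rotation))

# R3, by contrast, must rotate query/key states inside the attention forward
# itself, so it cannot be expressed as a simple module-level hook like this.
out = down_proj(torch.randn(2, hidden))
```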
## Prerequisites ##
* vllm-project/vllm#22486
## Testing ##
* Performed sanity checks with HF and vLLM
---------
Signed-off-by: Kyle Sayers <[email protected]>