Closed
Description
from flash_attn import flash_attn_func

This import is not needed:
SpecForge/specforge/modeling/draft/llama3_eagle.py
Lines 27 to 32 in ed87efa
try:
    from flash_attn import flash_attn_func
except:
    warnings.warn(
        "flash_attn is not found, please install flash_attn if you want to use the flash attention backend"
    )
As this guarded import already exists, the unconditional import above is redundant and can be removed.
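
For reference, a minimal sketch of the guarded-import pattern, assuming a None sentinel (the sentinel and the availability check below are illustrative, not SpecForge's actual code), showing why a second unconditional import is unnecessary:

import warnings

# Optional dependency: degrade gracefully instead of raising
# ImportError at module import time when flash_attn is absent.
try:
    from flash_attn import flash_attn_func
except ImportError:  # narrower than the bare except in the snippet above
    flash_attn_func = None  # assumed sentinel, checked before use
    warnings.warn(
        "flash_attn is not found, please install flash_attn "
        "if you want to use the flash attention backend"
    )

# Call sites can then test the sentinel rather than importing again:
if flash_attn_func is not None:
    pass  # safe to use the flash attention backend here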