I’m trying to use your model V1-7B-sft-s3-reasoning in Colab/other Python environments.
During the loading process, I encountered the following error:
"It seems that the fla package, which contains the implementations for ShortConvolution, GatedLinearAttention, and RMSNorm, is not publicly available."
Could you please consider making the fla package publicly available, or providing instructions on how to install it, so that the model can be run outside your internal environment?
Thank you for your work and for sharing your research!