
Solved.
The cause was that the automatically installed versions of transformers (4.40.0) and torch (2.2.2) were too new, so FlashAttention was invoked automatically. Downgrading them to 4.31.0 and 2.0.1 respectively resolved the issue.
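A minimal sketch of the fix described above, pinning both packages to the versions the answer reports as working (the exact install command may differ depending on your environment and package manager):

```shell
# Downgrade to the versions reported as working in the answer above,
# so the model code no longer routes attention through FlashAttention.
pip install "transformers==4.31.0" "torch==2.0.1"
```

If the project uses a requirements file, the equivalent pins are `transformers==4.31.0` and `torch==2.0.1`.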

Replies: 1 comment

0 replies
Answer selected by aizongabc
Category
Q&A
1 participant