integrate huggingface inference endpoint with flowise issue🆘 #2042
Unanswered · For-Pubilc-uses asked this question in Q&A
Replies: 1 comment
- Which chain are you using? Here's a quick video about using HF models on Flowise: https://youtu.be/rGehI5PNP2o?si=v3t9qo2Fa8jy6nc5
- I am trying to integrate the model mistralai/Mixtral-8x7B-Instruct-v0.1 from Hugging Face, which I have already deployed as an Inference Endpoint, and I got a URL that I can paste into Flowise's "Endpoint URL" field (screenshot: Add Endpoint URL).

  However, it doesn't work as expected: when I try to run the model, it shows an error (screenshot: Error). After setting max_token to 250, I can generate text, but the model still behaves randomly with no expected output (screenshot: Model Output).

  It is supposed to be a translator that translates everything into English. This is how the model behaves when I use the free Inference API from Hugging Face (screenshot: Behaved Correctly).

  There is a rate limit on the free Inference API, which is why I paid for the Inference Endpoint, but it behaves very differently from the free API. I'm confused; please help me out if anyone knows. Thanks so much!
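One common cause of this kind of "random" output is that the free Inference API may apply the model's chat template for you, while a raw Inference Endpoint (served by text-generation-inference) receives the prompt verbatim, so Mixtral-Instruct never sees its expected `[INST] ... [/INST]` wrapper. As a minimal sketch (not the asker's confirmed fix), here is how the request payload could be built by hand; the endpoint URL and token below are placeholders, and `build_tgi_payload` is a hypothetical helper:

```python
import json

def build_tgi_payload(user_text: str, max_new_tokens: int = 250) -> dict:
    # Mixtral-8x7B-Instruct expects the [INST] chat template; without it,
    # a raw endpoint tends to produce rambling continuations instead of
    # following the translation instruction.
    prompt = f"<s>[INST] Translate the following into English:\n{user_text} [/INST]"
    return {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": 0.2,         # low temperature for a more deterministic translation
            "return_full_text": False,  # return only the generated completion, not the prompt
        },
    }

# Hypothetical usage against a deployed endpoint (URL and token are placeholders):
# import requests
# resp = requests.post(
#     "https://YOUR-ENDPOINT.endpoints.huggingface.cloud",
#     headers={"Authorization": "Bearer hf_xxx", "Content-Type": "application/json"},
#     json=build_tgi_payload("Bonjour le monde"),
# )
# print(resp.json()[0]["generated_text"])

print(json.dumps(build_tgi_payload("Bonjour le monde"), indent=2))
```

If a hand-built payload with the `[INST]` template translates correctly, the problem is prompt formatting rather than the endpoint itself, and the fix in Flowise would be choosing a chain/node that applies the instruct template.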