Authentication token for HuggingFaceTextGenInference #7323
Replies: 2 comments 1 reply
-
Answer generated by a 🤖

I understand that you're having trouble authenticating with the HuggingFace API when using the `HuggingFaceTextGenInference` class. Here's a potential solution: you can pass your HuggingFace API token directly to the constructor:

```python
llm = HuggingFaceTextGenInference(
    inference_server_url=server_url,
    huggingfacehub_api_token=HUGGING_FACE_API_KEY,
    max_new_tokens=2048,
    top_k=10,
    top_p=0.95,
    typical_p=0.95,
    temperature=0.01,
    repetition_penalty=1.1,
)
```

In the call above, `HUGGING_FACE_API_KEY` is your own API token. Please note that this is one potential solution based on the information you've provided; if it doesn't resolve your issue, there may be other factors at play that will need further investigation. If this solution works for you and you feel this information is missing or not clearly stated in the existing documentation, we encourage you to contribute to the LangChain project by making a pull request to update the documentation. This will help other users who encounter the same issue in the future.
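As a quick sanity check before wiring the token into LangChain, it can help to confirm that the token string produces the `Bearer` header a protected endpoint expects. This is an illustrative sketch only; `bearer_headers` is a hypothetical helper, and `"hf_dummy"` stands in for a real token:

```python
def bearer_headers(token: str) -> dict:
    """Build the Authorization header a protected HuggingFace endpoint expects."""
    if not token:
        raise ValueError("empty HuggingFace API token")
    return {"Authorization": f"Bearer {token}"}

# Example with a dummy value; real tokens look like "hf_...".
print(bearer_headers("hf_dummy"))  # → {'Authorization': 'Bearer hf_dummy'}
```

Failing fast on an empty token gives a clearer error than the server-side "A valid user token is required" message.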
-
I finally got the answer:

```python
headers = {'authorization': f'Bearer {api_auth_key}'}  # your own API key

llm = HuggingFaceTextGenInference(
    inference_server_url="https://xyz.endpoints.huggingface.cloud",  # protected endpoint
    max_new_tokens=256,
    top_k=10,
    top_p=0.95,
    typical_p=0.95,
    temperature=0.01,
    repetition_penalty=1.1,
    server_kwargs={"headers": headers},
)
```

Hope this helps.
-
I have set up a HuggingFace inference endpoint here, in protected mode, in an organization.
In LangChain, I am using the following code to get the LLM:
However, I got the following error when trying to run inference on the model:

```
text_generation.errors.UnknownError: A valid user token is required
```

The problem is that I don't know where to input the user token. I have tried the following:

- Using `huggingface-cli login`
- Using
- Using