`docs/inference-providers/register-as-a-provider.md`
Congratulations! You now have a JS implementation to successfully make inference calls.
The first step is to use the Model Mapping API to register which HF models are supported (see the sketch below).
> [!TIP]
> Completing steps 1 and 2 is a prerequisite for this step.
> To proceed with this step, we have to enable your account server-side. Make sure you have an organization on the Hub for your company, and upgrade it to a [Team or Enterprise plan](https://huggingface.co/pricing).
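For illustration, here is a minimal sketch of what a mapping registration call could look like. The route shape, the provider name `example-provider`, the model ids, and the body fields are placeholders to adapt to the Model Mapping API reference, not a confirmed schema.

```ts
// Minimal sketch of registering a model mapping. The route shape, field names,
// and ids below are illustrative placeholders, not a confirmed schema.
const HF_TOKEN = process.env.HF_TOKEN; // token of a member of your provider organization

const response = await fetch("https://huggingface.co/api/partners/example-provider/models", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${HF_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    task: "text-generation",                        // task the mapping applies to
    hfModel: "meta-llama/Llama-3.1-8B-Instruct",    // model id on the Hub
    providerModel: "example-provider/llama-3.1-8b", // your internal model id (placeholder)
  }),
});

if (!response.ok) {
  throw new Error(`Mapping registration failed: ${response.status} ${await response.text()}`);
}
```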
### Exposing pricing through OpenAI /models routes
If your API is OpenAI-compatible, we expect you to expose LLM pricing information and context length through the [`/v1/models` endpoint](https://platform.openai.com/docs/api-reference/models/list).
This powers our [provider comparison table](https://huggingface.co/inference/models) and other provider selection features like `:cheapest` (which selects the cheapest provider for a model).
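As a rough sketch of the kind of metadata this implies, the snippet below models a single `/v1/models` entry enriched with pricing and context-length fields. The field names (`pricing.input`, `pricing.output`, `context_length`) and units are illustrative assumptions, not a confirmed schema.

```ts
// Hypothetical shape of one /v1/models entry enriched with pricing metadata.
// Field names and units are assumptions for illustration only.
interface ProviderModel {
  id: string;             // model id as exposed by your API
  object: "model";
  created: number;        // unix timestamp
  owned_by: string;
  context_length: number; // maximum context window, in tokens (assumed field)
  pricing: {
    input: number;        // e.g. USD per 1M input tokens (assumed unit)
    output: number;       // e.g. USD per 1M output tokens (assumed unit)
  };
}

// Example entry your GET /v1/models response could include:
const exampleEntry: ProviderModel = {
  id: "example-provider/llama-3.1-8b",
  object: "model",
  created: 1724000000,
  owned_by: "example-provider",
  context_length: 131072,
  pricing: { input: 0.1, output: 0.4 },
};
```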