4 changes: 3 additions & 1 deletion docs/inference-providers/_toctree.yml
@@ -23,6 +23,8 @@
title: How to use OpenAI gpt-oss
- local: guides/image-editor
title: Build an Image Editor
- local: guides/vscode-copilot
title: VS Code Copilot

- local: tasks/index
title: Inference Tasks
@@ -106,4 +108,4 @@
title: Hub API

- local: register-as-a-provider
title: Register as an Inference Provider
title: Register as an Inference Provider
23 changes: 23 additions & 0 deletions docs/inference-providers/guides/vscode-copilot.md
@@ -0,0 +1,23 @@
# 🤗 Hugging Face Inference Providers for VS Code Copilot

![Demo](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/demo.gif)

<!-- Place demo.gif at docs/inference-providers/guides/assets/demo.gif -->

Bring SoTA open‑source LLMs to VS Code Copilot Chat with [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index), built on the [Language Model Chat Provider API](https://code.visualstudio.com/api/extension-guides/ai/language-model-chat-provider).

## ⚡ Quick start
1. Install the HF Copilot Chat extension [here](#todo).
2. Open VS Code's chat interface.
3. Click the model picker and select "Manage Models...".
4. Select the "Hugging Face" provider.
5. Enter your Hugging Face token; you can create one on your [settings page](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained). To confirm the token works outside VS Code, see the sketch after this list.
6. Select the models you want to add to the model picker.
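
If you want to confirm the token works before adding models in VS Code, you can call the OpenAI-compatible Inference Providers router directly. The sketch below is a minimal check, assuming Node 18+ (for the built-in `fetch`), your token in an `HF_TOKEN` environment variable, and an illustrative model ID; swap in any model you have access to.

```ts
// check-token.ts - minimal sketch: send one chat completion through the
// Inference Providers router with the token you just created.
async function main() {
  const response = await fetch("https://router.huggingface.co/v1/chat/completions", {
    method: "POST",
    headers: {
      // Fine-grained token with inference permissions, exported as HF_TOKEN.
      Authorization: `Bearer ${process.env.HF_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      // Illustrative model; any model served through Inference Providers works.
      model: "meta-llama/Llama-3.1-8B-Instruct",
      messages: [{ role: "user", content: "Reply with the word ok." }],
    }),
  });

  if (!response.ok) {
    throw new Error(`Router returned ${response.status}: ${await response.text()}`);
  }
  const data = await response.json();
  console.log(data.choices[0].message.content);
}

main().catch(console.error);
```

A short completion in the output means the token and model are ready to use in the extension.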

## ✨ Why use the Hugging Face provider in Copilot
- 4k+ open‑source LLMs with tool calling capabilities.
- Single API to thousands of open-source LLMs via providers like Groq, Cerebras, Together AI, SambaNova, and more (see the routing sketch after this list).
- Built for high availability and low latency, with requests served by world-class providers.
- No extra markup on provider rates.
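
The router speaks the OpenAI chat-completions protocol, so any OpenAI-compatible client can use it. The sketch below is illustrative rather than part of the extension: it assumes the `openai` npm package, an ESM/Node 18+ setup, an `HF_TOKEN` environment variable, and an example model ID. The `:groq` suffix pins the request to one provider, while omitting the suffix lets Hugging Face pick one for you.

```ts
import OpenAI from "openai";

// Point an OpenAI-compatible client at the Inference Providers router.
const client = new OpenAI({
  baseURL: "https://router.huggingface.co/v1",
  apiKey: process.env.HF_TOKEN,
});

// Let Hugging Face route the request to an available provider...
const auto = await client.chat.completions.create({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello!" }],
});

// ...or pin it to a specific provider with a ":provider" suffix (illustrative).
const pinned = await client.chat.completions.create({
  model: "meta-llama/Llama-3.1-8B-Instruct:groq",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(auto.choices[0].message.content);
console.log(pinned.choices[0].message.content);
```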

💡 The free tier gives you a small amount of monthly inference credits to experiment. Upgrade to [Hugging Face PRO](https://huggingface.co/pro) or [Enterprise](https://huggingface.co/enterprise) for $2 in monthly credits plus pay-as-you-go access across all providers!