
Commit c247bda

Up.
1 parent 1b339c7 commit c247bda

File tree

3 files changed: +25 −23 lines changed


docs/inference-providers/_toctree.yml

Lines changed: 2 additions & 2 deletions
@@ -23,8 +23,8 @@
       title: How to use OpenAI gpt-oss
     - local: guides/image-editor
       title: Build an Image Editor
-    - local: guides/vscode-copilot
-      title: VS Code Copilot
+    - local: guides/vscode
+      title: VS Code with GitHub Copilot
 
     - local: tasks/index
       title: Inference Tasks

docs/inference-providers/guides/vscode-copilot.md

Lines changed: 0 additions & 21 deletions
This file was deleted.
docs/inference-providers/guides/vscode.md

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
# 🤗 Hugging Face Inference Providers for VS Code Copilot
![Demo](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/demo.gif)
You can now use SoTA open‑source LLMs like Kimi K2, DeepSeek V3.1, GLM 4.5, and more in VS Code with GitHub Copilot Chat, powered by [Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index) 🔥
## ⚡ Quick start
1. Install the [HF Copilot Chat extension](https://marketplace.visualstudio.com/items?itemName=HuggingFace.huggingface-vscode-chat).
2. Open VS Code's chat interface.
3. Click the model picker and select "Manage Models...".
4. Select the "Hugging Face" provider.
5. Provide your Hugging Face token; you can create one on your [settings page](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained).
6. Choose the models you want to add to the model picker. 🥳
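If you prefer the terminal, the install in step 1 can also be done with the VS Code CLI. A minimal sketch, assuming the `code` command that ships with VS Code is on your PATH; the extension ID comes from the Marketplace link above:

```shell
# Install the HF Copilot Chat extension via the VS Code CLI.
# Falls back to a hint if the `code` command is unavailable.
EXT_ID="HuggingFace.huggingface-vscode-chat"
if command -v code >/dev/null 2>&1; then
  code --install-extension "$EXT_ID"
else
  echo "VS Code CLI not found; install '$EXT_ID' from the Marketplace UI instead."
fi
```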
## ✨ Why use the Hugging Face provider in Copilot
- Access [SoTA frontier open‑source LLMs](https://huggingface.co/models?pipeline_tag=text-generation&inference_provider=cerebras,together,fireworks-ai,nebius,novita,sambanova,groq,hyperbolic,nscale,fal-ai,cohere,replicate,scaleway,black-forest-labs,ovhcloud&sort=trending) with tool-calling capabilities.
- A single API to thousands of open‑source LLMs via providers like Groq, Cerebras, Together AI, SambaNova, and more.
- Built for high availability (across providers) and low latency, backed by world‑class providers.
- Transparent pricing: what the provider charges is what you pay.

💡 The free tier gives you a small amount of monthly inference credits to experiment. Upgrade to [Hugging Face PRO](https://huggingface.co/pro) or [Enterprise](https://huggingface.co/enterprise) for $2 in monthly credits plus pay‑as‑you‑go access across all providers!
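Outside of Copilot, the same models can be reached programmatically through the Inference Providers OpenAI-compatible router. A minimal stdlib-only sketch: the router URL follows the Inference Providers docs, while the model ID is an illustrative placeholder; the request is only sent when an `HF_TOKEN` environment variable is set.

```python
import json
import os
import urllib.request

# Sketch: chat-completion request against the Inference Providers
# OpenAI-compatible router.
ROUTER_URL = "https://router.huggingface.co/v1/chat/completions"

def build_request(model: str, prompt: str, token: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completion request."""
    payload = {
        "model": model,  # illustrative model ID; any Inference Providers model works
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("deepseek-ai/DeepSeek-V3.1", "Hello!", os.environ.get("HF_TOKEN", ""))
print(req.full_url)
if os.environ.get("HF_TOKEN"):  # only touch the network when a token is configured
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request is built separately from being sent, you can inspect the payload and headers before spending any credits.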
