# Using Cursor as a client of `transformers serve`

This example shows how to use `transformers serve` as a local LLM provider for [Cursor](https://cursor.com/), the popular IDE. In this particular case, requests to `transformers serve` will come from an external IP (Cursor's server IPs), which requires some additional setup. Furthermore, some of Cursor's requests require [CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/CORS), which is disabled by default for security reasons.

To launch a server with CORS enabled, run

```shell
transformers serve --enable-cors
```

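Before wiring up Cursor, you can confirm CORS is active by sending a browser-style preflight request yourself. The sketch below assumes the server is running locally on the default port `8000` and exposes the OpenAI-compatible `/v1/models` route:

```python
# Sketch: send a CORS preflight (OPTIONS) request, like a browser would,
# and inspect the server's response headers. Assumes the default port 8000;
# adjust the URL if you picked another one.
import urllib.request

req = urllib.request.Request(
    "http://localhost:8000/v1/models",
    method="OPTIONS",
    headers={
        "Origin": "https://cursor.com",
        "Access-Control-Request-Method": "GET",
    },
)
try:
    with urllib.request.urlopen(req) as resp:
        # With --enable-cors, this header should be present in the response.
        print(resp.headers.get("Access-Control-Allow-Origin"))
except OSError:
    print("Server not reachable -- is `transformers serve --enable-cors` running?")
```
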
You'll also need to expose your server to external IPs. A potential solution is to use [`ngrok`](https://ngrok.com/), which has a permissive free tier. After setting up your `ngrok` account and authenticating on your server machine, run

```shell
ngrok http [port]
```

where `[port]` is the port used by `transformers serve` (`8000` by default). In the terminal where you launched `ngrok`, you'll see an https address in the "Forwarding" row, as in the image below. This is the address to send requests to.

<h3 align="center">
    <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/transformers_serve_ngrok.png"/>
</h3>

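Once the tunnel is up, you can check the forwarded address end-to-end by sending it the same kind of request Cursor will. This is a minimal sketch using the standard library's `urllib`; the `ngrok` address below is a placeholder you'd replace with your own "Forwarding" URL:

```python
# Sketch: build (and optionally send) an OpenAI-style chat completion request
# through the ngrok tunnel. NGROK_URL is a placeholder -- substitute the
# "Forwarding" address from your ngrok terminal.
import json
import urllib.request

NGROK_URL = "https://example.ngrok-free.app"  # placeholder address

payload = {
    "model": "Qwen/Qwen3-4B",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    f"{NGROK_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the tunnel is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```
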
You're now ready to set things up on the app side! In Cursor, while you can't add a new provider, you can change the endpoint for OpenAI requests in the model selection settings. First, navigate to "Settings" > "Cursor Settings", open the "Models" tab, and expand the "API Keys" collapsible. To set your `transformers serve` endpoint, follow this order:
1. Unselect ALL models in the list above (e.g. `gpt4`, ...);
2. Add and select the model you want to use (e.g. `Qwen/Qwen3-4B`);
3. Add some random text to the "OpenAI API Key" field. This field won't be used, but it can't be empty;
4. Add the https address from `ngrok` to the "Override OpenAI Base URL" field, appending `/v1` to the address (i.e. `https://(...).ngrok-free.app/v1`);
5. Hit "Verify".

After you follow these steps, your "Models" tab should look like the image below. Your server should also have received a few requests from the verification step.

<h3 align="center">
    <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/transformers_serve_cursor.png"/>
</h3>

You are now ready to use your local model in Cursor! For instance, if you toggle the AI Pane, you can select the model you added and ask it questions about your local files.

<h3 align="center">
    <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/transformers_serve_cursor_chat.png"/>
</h3>