awesome-gpt-oss.md: 4 additions & 4 deletions
@@ -41,7 +41,7 @@ This is a list of guides and resources to help you get started with the gpt-oss
   - [Optimizing gpt-oss with NVIDIA TensorRT-LLM](https://cookbook.openai.com/articles/run-nvidia)
   - [Deploying gpt-oss on TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM/blob/main/docs/source/blogs/tech_blog/blog9_Deploying_GPT_OSS_on_TRTLLM.md)
 - AMD
-  - [Running the Latest Open Models from OpenAI on AMD AI Hardware](https://rocm.blogs.amd.com/ecosystems-and-partners/openai-day-0/README.html)
+  - [Running the Latest Open Models from OpenAI on AMD AI Hardware](https://rocm.blogs.amd.com/ecosystems-and-partners/openai-day-0/README.html)

 ### Cloud

@@ -50,18 +50,18 @@ This is a list of guides and resources to help you get started with the gpt-oss
   - [gpt-oss-120b model on the GroqCloud Playground](https://console.groq.com/playground?model=openai/gpt-oss-120b)
   - [gpt-oss-20b model on the GroqCloud Playground](https://console.groq.com/playground?model=openai/gpt-oss-20b)
   - [gpt-oss with built-in web search on GroqCloud](https://console.groq.com/docs/browser-search)
-  - [gpt-oss with built-in code execution on GroqCloud](https://console.groq.com/docs/code-execution)
+  - [gpt-oss with built-in code execution on GroqCloud](https://console.groq.com/docs/code-execution)
   - [Responses API on Groq](https://console.groq.com/docs/responses-api)
 - NVIDIA
   - [NVIDIA launch blog post](https://blogs.nvidia.com/blog/openai-gpt-oss/)
   - [NVIDIA & gpt-oss developer launch blog post](https://developer.nvidia.com/blog/delivering-1-5-m-tps-inference-on-nvidia-gb200-nvl72-nvidia-accelerates-openai-gpt-oss-models-from-cloud-to-edge/)
   - Use [gpt-oss-120b](https://build.nvidia.com/openai/gpt-oss-120b) and [gpt-oss-20b](https://build.nvidia.com/openai/gpt-oss-20b) on NVIDIA's Cloud
 - Cloudflare
-  - [Cloudflare & gpt-oss launch blog post](http://blog.cloudflare.com/openai-gpt-oss-on-workers-ai)
+  - [Cloudflare & gpt-oss launch blog post](https://blog.cloudflare.com/openai-gpt-oss-on-workers-ai)
   - [gpt-oss-120b on Cloudflare Workers AI](https://developers.cloudflare.com/workers-ai/models/gpt-oss-120b)
   - [gpt-oss-20b on Cloudflare Workers AI](https://developers.cloudflare.com/workers-ai/models/gpt-oss-20b)
 - AMD
-  - [gpt-oss-120B on AMD MI300X](https://huggingface.co/spaces/amd/gpt-oss-120b-chatbot)
+  - [gpt-oss-120B on AMD MI300X](https://huggingface.co/spaces/amd/gpt-oss-120b-chatbot)
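Most of the cloud hosts above expose gpt-oss through OpenAI-compatible APIs. As a minimal sketch, assuming Groq's OpenAI-compatible endpoint at `https://api.groq.com/openai/v1`, an API key in `GROQ_API_KEY`, and the `openai` Python package (none of which the list above spells out), a chat call against the hosted `openai/gpt-oss-120b` model might look like:

```python
import os

from openai import OpenAI

# Assumption: Groq serves gpt-oss behind an OpenAI-compatible Chat Completions API;
# the model id matches the playground URL above (openai/gpt-oss-120b).
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="openai/gpt-oss-120b",
    messages=[{"role": "user", "content": "Summarize gpt-oss in one sentence."}],
)
print(response.choices[0].message.content)
```

The same sketch should carry over to the other OpenAI-compatible hosts in the list by swapping the base URL, API key, and model id.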
gpt-oss-mcp-server/README.md: 4 additions & 4 deletions
@@ -1,8 +1,8 @@
 # MCP Servers for gpt-oss reference tools

 This directory contains MCP servers for the reference tools in the [gpt-oss](https://github.com/openai/gpt-oss) repository.
-You can set up these tools behind MCP servers and use them in your applications.
-For inference service that integrates with MCP, you can also use these as reference tools.
+You can set up these tools behind MCP servers and use them in your applications.
+For inference service that integrates with MCP, you can also use these as reference tools.

 In particular, this directory contains a `build-system-prompt.py` script that will generate exactly the same system prompt as `reference-system-prompt.py`.
 The build system prompt script show case all the care needed to automatically discover the tools and construct the system prompt before feeding it into Harmony.
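As a rough illustration of that last step, a bare Harmony render, assuming the `openai-harmony` Python package and leaving out the MCP-discovered tool definitions that `build-system-prompt.py` actually folds into the system content, might look like:

```python
from openai_harmony import (
    Conversation,
    HarmonyEncodingName,
    Message,
    Role,
    SystemContent,
    load_harmony_encoding,
)

# Sketch only: build-system-prompt.py additionally injects the tool definitions
# discovered from the MCP servers before rendering; this renders a bare system message.
encoding = load_harmony_encoding(HarmonyEncodingName.HARMONY_GPT_OSS)
convo = Conversation.from_messages(
    [Message.from_role_and_content(Role.SYSTEM, SystemContent.new())]
)
tokens = encoding.render_conversation_for_completion(convo, Role.ASSISTANT)
print(f"Rendered {len(tokens)} prompt tokens")
```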
@@ -22,8 +22,8 @@ mcp run -t sse browser_server.py:mcp
 mcp run -t sse python_server.py:mcp
 ```

-You can now use MCP inspector to play with the tools.
+You can now use MCP inspector to play with the tools.
 Once opened, set SSE to `http://localhost:8001/sse` and `http://localhost:8000/sse` respectively.

-To compare the system prompt and see how to construct it via MCP service discovery, see `build-system-prompt.py`.
+To compare the system prompt and see how to construct it via MCP service discovery, see `build-system-prompt.py`.
 This script will generate exactly the same system prompt as `reference-system-prompt.py`.
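For a quick check outside the MCP inspector, a small SSE client sketch, assuming the `mcp` Python SDK and the Python tool server from the snippet above listening on `http://localhost:8000/sse`, could list the discovered tools like this:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Assumption: python_server.py is running via `mcp run -t sse python_server.py:mcp`
    # and serving SSE on port 8000, matching the inspector settings above.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # These are the same tool definitions that build-system-prompt.py
            # discovers and folds into the Harmony system prompt.
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(main())
```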