diff --git a/README.md b/README.md
index 1796d38d..a56ef989 100644
--- a/README.md
+++ b/README.md
@@ -325,22 +325,22 @@ options:
 We support [codex](https://github.com/openai/codex) as a client for gpt-oss. To run the 20b version, set this to `~/.codex/config.toml`:
 
-```
+```toml
 disable_response_storage = true
 show_reasoning_content = true
 
 [model_providers.local]
 name = "local"
-base_url = "http://localhost:11434/v1"
+base_url = "http://localhost:8000/v1"
 
 [profiles.oss]
 model = "gpt-oss:20b"
 model_provider = "local"
 ```
 
-This will work with any chat completions-API compatible server listening on port 11434, like ollama. Start the server and point codex to the oss model:
+This will work with any Chat Completions API-compatible server listening on port 8000, such as vLLM or the Responses API server. Note that Ollama listens on port 11434 by default, so adjust `base_url` accordingly if you use it. Start the server and point codex to the oss model:
 
-```
+```bash
 ollama run gpt-oss:20b
 codex -p oss
 ```
diff --git a/examples/README.md b/examples/README.md
new file mode 100644
index 00000000..f81ab09a
--- /dev/null
+++ b/examples/README.md
@@ -0,0 +1,78 @@
+# gpt-oss Examples
+
+This directory contains various examples demonstrating how to use gpt-oss in different scenarios and with different frameworks.
+
+## Available Examples
+
+### [Streamlit Chat](./streamlit/)
+A simple chat interface built with Streamlit that connects to a local gpt-oss server.
+
+**Features:**
+- Interactive web-based chat interface
+- Real-time responses
+- Easy to customize and extend
+
+### [Agents SDK - Python](./agents-sdk-python/)
+Example using the OpenAI Agents SDK with Python to create an intelligent agent that can use tools and MCP servers.
+
+**Features:**
+- Tool integration (weather example)
+- MCP server connectivity for filesystem operations
+- Streaming responses
+- Async/await support
+
+### [Agents SDK - JavaScript/TypeScript](./agents-sdk-js/)
+TypeScript example using the OpenAI Agents SDK to create an intelligent agent with tool-calling capabilities.
+
+**Features:**
+- Tool integration
+- MCP server connectivity
+- TypeScript support
+- Modern async/await patterns
+
+### [Gradio Chat](./gradio/)
+A simple chat interface using the Gradio framework.
+
+**Features:**
+- Quick setup with Gradio
+- Web-based interface
+- Easy deployment
+
+## Getting Started
+
+1. **Start a local gpt-oss server** on `http://localhost:8000`
+2. **Choose an example** from the directories above
+3. **Follow the README** in each example directory for specific setup instructions
+
+## Prerequisites
+
+- Python 3.12+
+- A running gpt-oss server (see the main README for setup instructions)
+- Framework-specific dependencies (listed in each example's README)
+
+## Common Setup
+
+Most examples assume you have a local gpt-oss server running. You can start one using:
+
+```bash
+# Using the responses API server
+python -m gpt_oss.responses_api.serve --checkpoint gpt-oss-20b/original/ --port 8000
+
+# Or using vLLM
+vllm serve openai/gpt-oss-20b --port 8000
+
+# Or using Ollama (note: Ollama listens on port 11434 by default)
+ollama serve
+ollama run gpt-oss:20b
+```
+
+If you're running the UI locally, it typically serves on `http://localhost:8081`.
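+
+To confirm the server is reachable before diving into an example, you can run a quick smoke test. This is a minimal sketch assuming a Chat Completions-compatible endpoint at `http://localhost:8000/v1`; the model name may differ depending on which server you started:
+
+```python
+# smoke_test.py - verify the local gpt-oss server responds
+from openai import OpenAI
+
+# Local servers usually ignore the API key, but the client requires one.
+client = OpenAI(api_key="local", base_url="http://localhost:8000/v1")
+
+response = client.chat.completions.create(
+    model="gpt-oss:20b",  # adjust to the model name your server exposes
+    messages=[{"role": "user", "content": "Say hello in one sentence."}],
+)
+print(response.choices[0].message.content)
+```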
+
+## Contributing
+
+When adding new examples:
+1. Create a new directory with a descriptive name
+2. Include a comprehensive README.md with setup instructions
+3. Ensure all dependencies are clearly listed
+4. Test the example thoroughly
+5. Update this main examples README to include your new example
diff --git a/examples/agents-sdk-python/README.md b/examples/agents-sdk-python/README.md
new file mode 100644
index 00000000..8d42b2db
--- /dev/null
+++ b/examples/agents-sdk-python/README.md
@@ -0,0 +1,123 @@
+# Agents SDK Python Example
+
+This example demonstrates how to use the OpenAI Agents SDK with Python to create an intelligent agent that can interact with tools and MCP servers.
+
+## Prerequisites
+
+- Python 3.12+
+- Node.js and npm (for the MCP server)
+- A running gpt-oss server
+
+## Installation
+
+1. Install Python dependencies:
+
+```bash
+pip install openai openai-agents
+```
+
+2. Make sure `npx` is available for the MCP server (it ships with npm 5.2+; install it separately only if it is missing):
+
+```bash
+npm install -g npx
+```
+
+## Configuration
+
+The example is configured to connect to a local gpt-oss server. Update the configuration in `example.py` if needed:
+
+```python
+openai_client = AsyncOpenAI(
+    api_key="local",
+    base_url="http://localhost:8000/v1",
+)
+```
+
+## Running the Example
+
+1. Start your local gpt-oss server on `http://localhost:8000`
+
+2. Run the Python example:
+
+```bash
+python example.py
+```
+
+3. Enter your message when prompted and interact with the agent
+
+## Features
+
+### Tool Integration
+The example includes a simple weather tool that demonstrates how to integrate custom functions:
+
+```python
+@function_tool
+async def get_weather(location: str) -> str:
+    return f"The weather in {location} is sunny."
+```
+
+### MCP Server Integration
+The agent connects to a filesystem MCP server that allows it to:
+- Read files
+- Write files
+- List directories
+- Navigate the filesystem
+
+### Streaming Responses
+The example demonstrates how to handle streaming responses and different event types (see the sketch below):
+- Tool calls
+- Tool outputs
+- Message outputs
+- Agent updates
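+
+A rough sketch of such an event loop, assuming the `openai-agents` package and the local server configured above (this is illustrative, not a verbatim copy of `example.py`):
+
+```python
+import asyncio
+
+from agents import (
+    Agent,
+    OpenAIChatCompletionsModel,
+    Runner,
+    function_tool,
+    set_tracing_disabled,
+)
+from openai import AsyncOpenAI
+
+# Tracing would try to reach the OpenAI platform; disable it for local use.
+set_tracing_disabled(True)
+
+openai_client = AsyncOpenAI(api_key="local", base_url="http://localhost:8000/v1")
+
+
+@function_tool
+async def get_weather(location: str) -> str:
+    return f"The weather in {location} is sunny."
+
+
+async def main() -> None:
+    agent = Agent(
+        name="Assistant",
+        instructions="You are a helpful assistant.",
+        tools=[get_weather],
+        model=OpenAIChatCompletionsModel(
+            model="gpt-oss:20b", openai_client=openai_client
+        ),
+    )
+    result = Runner.run_streamed(agent, input="What's the weather in Tokyo?")
+    async for event in result.stream_events():
+        if event.type == "run_item_stream_event":
+            # Item-level events cover tool calls, tool outputs, and messages.
+            if event.item.type == "tool_call_item":
+                print("-- tool called")
+            elif event.item.type == "tool_call_output_item":
+                print(f"-- tool output: {event.item.output}")
+            elif event.item.type == "message_output_item":
+                print("-- message received")
+        elif event.type == "agent_updated_stream_event":
+            print(f"-- agent updated: {event.new_agent.name}")
+
+
+if __name__ == "__main__":
+    asyncio.run(main())
+```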
+
+## Customization
+
+### Adding New Tools
+You can add new tools by defining functions with the `@function_tool` decorator:
+
+```python
+@function_tool
+async def my_custom_tool(param: str) -> str:
+    # Your tool logic here
+    return "Tool result"
+```
+
+### Different Models
+Change the model by updating the agent configuration:
+
+```python
+agent = Agent(
+    name="My Agent",
+    instructions="You are a helpful assistant.",
+    tools=[get_weather],
+    model="gpt-oss:120b",  # or other available models
+    mcp_servers=[mcp_server],
+)
+```
+
+### Custom MCP Servers
+You can connect to different MCP servers by modifying the server configuration:
+
+```python
+mcp_server = MCPServerStdio(
+    name="Custom MCP Server",
+    params={
+        "command": "your-mcp-server-command",
+        "args": ["arg1", "arg2"],
+    },
+)
+```
+
+## Troubleshooting
+
+### npx not found
+If you get an error about `npx` not being found:
+
+```bash
+npm install -g npx
+```
+
+### Connection errors
+Ensure your gpt-oss server is running on the correct port (8000) and is accessible.
+
+### MCP server issues
+The filesystem MCP server requires `npx` to be installed and accessible in your PATH.
diff --git a/examples/streamlit/README.md b/examples/streamlit/README.md
new file mode 100644
index 00000000..4162854f
--- /dev/null
+++ b/examples/streamlit/README.md
@@ -0,0 +1,56 @@
+# Streamlit Chat Example
+
+This example demonstrates how to create a simple chat interface using Streamlit and gpt-oss.
+
+## Prerequisites
+
+- Python 3.12+
+- A running gpt-oss server
+- Streamlit installed
+
+## Installation
+
+1. Install dependencies:
+
+```bash
+pip install streamlit openai
+```
+
+2. Ensure you have a local gpt-oss server running
+
+## Running the Example
+
+1. Start your local gpt-oss server on `http://localhost:8000` (or modify the base URL in the code)
+
+2. Run the Streamlit application:
+
+```bash
+streamlit run streamlit_chat.py
+```
+
+3. Open your browser to the URL shown in the terminal (typically `http://localhost:8501`)
+
+## Configuration
+
+You can modify the base URL and other settings by editing the configuration in `streamlit_chat.py`:
+
+```python
+client = OpenAI(
+    api_key="local",
+    base_url="http://localhost:8000/v1",
+)
+```
+
+## Features
+
+- Interactive chat interface
+- Real-time responses from gpt-oss
+- Simple and clean UI using Streamlit
+
+## Customization
+
+Feel free to modify the interface and add additional features (see the sketch below for one approach), such as:
+- Chat history persistence
+- Different model configurations
+- Custom styling
+- Tool integration
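+
+For example, chat history persistence can be handled with `st.session_state`. Here is a minimal, hypothetical sketch of such a chat loop, assuming the same local server (the actual `streamlit_chat.py` may differ):
+
+```python
+import streamlit as st
+from openai import OpenAI
+
+client = OpenAI(api_key="local", base_url="http://localhost:8000/v1")
+
+st.title("gpt-oss chat")
+
+# Persist the conversation across Streamlit reruns in session state.
+if "messages" not in st.session_state:
+    st.session_state.messages = []
+
+# Replay the stored conversation on each rerun.
+for message in st.session_state.messages:
+    with st.chat_message(message["role"]):
+        st.markdown(message["content"])
+
+if prompt := st.chat_input("Say something"):
+    st.session_state.messages.append({"role": "user", "content": prompt})
+    with st.chat_message("user"):
+        st.markdown(prompt)
+
+    with st.chat_message("assistant"):
+        # Stream tokens as they arrive for a responsive UI.
+        stream = client.chat.completions.create(
+            model="gpt-oss:20b",  # adjust to the model your server exposes
+            messages=st.session_state.messages,
+            stream=True,
+        )
+        reply = st.write_stream(
+            chunk.choices[0].delta.content or "" for chunk in stream
+        )
+    st.session_state.messages.append({"role": "assistant", "content": reply})
+```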