# Streaming ChatGPT Proxy

An example project using [Swift OpenAPI Generator](https://github.com/apple/swift-openapi-generator).

> **Disclaimer:** This example is deliberately simplified and is intended for illustrative purposes only.

## Overview

A tailored API server, backed by ChatGPT, and a client CLI, with end-to-end
streaming.

This package contains the reference sources for the demo presented at [FOSDEM 2025:
_Live coding a streaming ChatGPT proxy with Swift OpenAPI—from
scratch!_][fosdem25-swift-openapi].

> Join us as we build a ChatGPT client, from scratch, using Swift OpenAPI Generator. We’ll take advantage of Swift OpenAPI’s pluggable HTTP transports to reuse the same generated client to make upstream calls from a Linux server, providing end-to-end streaming, backed by async sequences, without buffering upstream responses.
>
> In this session you’ll learn how to:
>
> * Generate a type-safe ChatGPT macOS client and use URLSession OpenAPI transport.
> * Stream LLM responses using Server Sent Events (SSE).
> * Bootstrap a Linux proxy server using the Vapor OpenAPI transport.
> * Use the same generated ChatGPT client within the proxy by switching to the AsyncHTTPClient transport.
> * Efficiently transform responses from SSE to JSON Lines, maintaining end-to-end streaming.

The example provides an API for a fictitious _ChantGPT_ service, which produces
creative chants to sing at basketball games. 🙌 🏀 🙌
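
The session bullets above are the core of this example: the same generated client type can be constructed with different transports on macOS and Linux, and the upstream SSE stream can be re-encoded as JSON Lines element by element, so nothing is buffered. Below is a minimal, hedged sketch of those two ideas; `Client`, `ChatChunk`, and the upstream server URL are illustrative assumptions, and the real generated names depend on the OpenAPI documents used in the demo.

```swift
import Foundation
import OpenAPIRuntime
import OpenAPIURLSession        // URLSessionTransport (used by the app/CLI)
import OpenAPIAsyncHTTPClient   // AsyncHTTPClientTransport (used by the Linux proxy)

// Hypothetical Codable type standing in for the generated chunk schema.
struct ChatChunk: Codable, Hashable, Sendable { var content: String? }

// The same generated client type, backed by two different transports.
let upstreamURL = URL(string: "https://api.openai.com/v1")!
let appClient = Client(serverURL: upstreamURL, transport: URLSessionTransport())
let proxyClient = Client(serverURL: upstreamURL, transport: AsyncHTTPClientTransport())

// Re-encode an upstream SSE response body as JSON Lines without buffering it:
// each decoded event is forwarded downstream as soon as it arrives.
func relay(_ upstreamBody: HTTPBody) -> HTTPBody {
    let chunks = upstreamBody
        .asDecodedServerSentEventsWithJSONData(of: ChatChunk.self) // SSE -> typed events
        .compactMap { $0.data }                                    // keep the JSON payloads
    return HTTPBody(
        chunks.asEncodedJSONLines(), // typed events -> JSON Lines bytes
        length: .unknown,
        iterationBehavior: .single
    )
}
```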

## Usage

The upstream calls to ChatGPT require an API token, which is configured using the `OPENAI_TOKEN` environment variable.
Rename `.env.example` to `.env` and replace the placeholder with your token.

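On the server side, here is a minimal sketch of how this token might be used: read it from the environment and attach it to every upstream request as a bearer token using a client middleware. The `ClientMiddleware` protocol comes from swift-openapi-runtime; `AuthMiddleware` itself is a hypothetical name, not necessarily the code used in this project.

```swift
import Foundation
import HTTPTypes
import OpenAPIRuntime

// Hypothetical middleware that adds `Authorization: Bearer <token>` to every
// request made by the generated client.
struct AuthMiddleware: ClientMiddleware {
    let token: String

    func intercept(
        _ request: HTTPRequest,
        body: HTTPBody?,
        baseURL: URL,
        operationID: String,
        next: (HTTPRequest, HTTPBody?, URL) async throws -> (HTTPResponse, HTTPBody?)
    ) async throws -> (HTTPResponse, HTTPBody?) {
        var request = request
        request.headerFields[.authorization] = "Bearer \(token)"
        return try await next(request, body, baseURL)
    }
}

// Read the token from the process environment (loading `.env` into the
// environment is assumed to have happened by this point).
guard let token = ProcessInfo.processInfo.environment["OPENAI_TOKEN"] else {
    fatalError("Set the OPENAI_TOKEN environment variable.")
}
// The middleware is passed to the generated client via its `middlewares:`
// initializer parameter.
let authMiddleware = AuthMiddleware(token: token)
```
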
Build and run the server using:

```console
% swift run ProxyServer
2025-01-30T09:12:23+0000 notice codes.vapor.application : [Vapor] Server starting on http://127.0.0.1:8080
...
```

Then, from another terminal, run the proxy client using:

```console
% swift run ClientCLI "That team with the Bull logo"
Build of product 'ClientCLI' complete! (7.24s)
🧑💼: That one with the bull logo
---
🤖: **"Charge Ahead, Chicago Bulls!"**

(Verse 1)
Red and black, we’re on the prowl,
Chicago Bulls, hear us growl!
From the Windy City, we take the lead,
Charging forward with lightning speed!

(Chorus)
B-U-L-L-S, Bulls! Bulls! Bulls!
We’re the team that never dulls!
Hoops and hustle, heart and soul,
Chicago Bulls, we’re on a roll!
...
```
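
The chant arrives incrementally rather than all at once. As a hedged sketch of how a CLI like this might consume the proxy's streamed response, the code below decodes a JSON Lines body chunk by chunk and prints each piece as it arrives. The operation name `createChant`, the `application_jsonl` body case, the `ChantChunk` payload type, and the request payload are illustrative assumptions about what the generator would produce from the proxy's OpenAPI document; `asDecodedJSONLines(of:)` comes from the OpenAPIRuntime library.

```swift
import Foundation
import OpenAPIRuntime
import OpenAPIURLSession

// Hypothetical Codable type standing in for the generated chunk schema.
struct ChantChunk: Codable, Hashable, Sendable { var text: String? }

// Client pointed at the local proxy server (any base path depends on the OpenAPI document).
let client = Client(
    serverURL: URL(string: "http://127.0.0.1:8080")!,
    transport: URLSessionTransport()
)

// Call the (assumed) streaming operation, then decode the JSON Lines body
// element by element, printing each piece as soon as it is received.
let response = try await client.createChant(body: .json(.init(prompt: "That team with the Bull logo")))
let chunks = try response.ok.body.application_jsonl.asDecodedJSONLines(of: ChantChunk.self)
for try await chunk in chunks {
    print(chunk.text ?? "", terminator: "")
}
print()
```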

## Linux development with VS Code Dev Containers

The package also contains configuration for developing with VS Code [Dev
Containers][dev-containers].

If you have the Dev Containers extension installed, use the `Dev Containers: Reopen in Container` command to switch to building and running for Linux.

[fosdem25-swift-openapi]: https://fosdem.org/2025/schedule/event/fosdem-2025-5230-live-coding-a-streaming-chatgpt-proxy-with-swift-openapi-from-scratch-/
[dev-containers]: https://code.visualstudio.com/docs/devcontainers/containers