Live coding a streaming ChatGPT proxy with Swift OpenAPI—from scratch!
- Track: Swift
- Room: K.4.401
- Day: Saturday
- Start: 16:20
- End: 16:40
Join us as we build a ChatGPT client, from scratch, using Swift OpenAPI Generator. We’ll take advantage of Swift OpenAPI’s pluggable HTTP transports to reuse the same generated client to make upstream calls from a Linux server, providing end-to-end streaming, backed by async sequences, without buffering upstream responses.
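To make the architecture concrete, here is a minimal sketch of the client side, assuming a `Client` type generated by Swift OpenAPI Generator from the OpenAI API document. The `createChatCompletion` operation, the `ChatCompletionChunk` schema, and the exact body case names are placeholders for whatever the generator produces, not code shown in the talk.

```swift
import Foundation
import OpenAPIRuntime
import OpenAPIURLSession

// Generated client, wired up with the URLSession-based transport on macOS.
let client = Client(
    serverURL: URL(string: "https://api.openai.com/v1")!,
    transport: URLSessionTransport()
)

// Placeholder operation and schema names: request a streamed chat completion.
let response = try await client.createChatCompletion(
    body: .json(.init(model: "gpt-4o", stream: true, messages: [
        .init(role: .user, content: "Hello!")
    ]))
)

// The SSE body is an async sequence of byte chunks. OpenAPIRuntime's
// event-stream helpers decode each event's JSON `data:` field as it arrives,
// so the response is never buffered in memory.
let events = try response.ok.body.text_event_hyphen_stream
    .asDecodedServerSentEventsWithJSONData(of: Components.Schemas.ChatCompletionChunk.self)

for try await event in events {
    print(event.data?.choices.first?.delta.content ?? "", terminator: "")
}
```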
In this session you’ll learn how to:
- Generate a type-safe ChatGPT macOS client and use the URLSession OpenAPI transport.
- Stream LLM responses using Server-Sent Events (SSE).
- Bootstrap a Linux proxy server using the Vapor OpenAPI transport.
- Use the same generated ChatGPT client within the proxy by switching to the AsyncHTTPClient transport.
- Efficiently transform responses from SSE to JSON Lines, maintaining end-to-end streaming (see the proxy sketch after this list).
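And a sketch of the Linux proxy under the same assumptions: the proxy exposes a hypothetical `proxyChat` operation through its own generated `APIProtocol`, reuses the same generated ChatGPT client by switching to the AsyncHTTPClient-backed transport, and re-encodes the upstream SSE stream as JSON Lines while keeping everything an async sequence. All operation and schema names here are placeholders.

```swift
import Foundation
import OpenAPIRuntime
import OpenAPIAsyncHTTPClient
import OpenAPIVapor
import Vapor

struct ChatProxy: APIProtocol {
    // Same generated ChatGPT client as on macOS, now backed by
    // AsyncHTTPClient so it runs inside the Linux server process.
    let upstream = Client(
        serverURL: URL(string: "https://api.openai.com/v1")!,
        transport: AsyncHTTPClientTransport()
    )

    // Hypothetical operation exposed by the proxy's own OpenAPI document.
    func proxyChat(_ input: Operations.proxyChat.Input) async throws -> Operations.proxyChat.Output {
        // Call upstream and decode its SSE stream lazily, without buffering.
        let upstreamResponse = try await upstream.createChatCompletion(
            body: .json(.init(model: "gpt-4o", stream: true, messages: [/* forwarded from input */]))
        )
        let events = try upstreamResponse.ok.body.text_event_hyphen_stream
            .asDecodedServerSentEventsWithJSONData(of: Components.Schemas.ChatCompletionChunk.self)

        // Re-encode each decoded event as JSON Lines and hand the resulting
        // async sequence straight to the downstream response body.
        let jsonLines = events.compactMap { $0.data }.asEncodedJSONLines()
        let body = HTTPBody(jsonLines, length: .unknown, iterationBehavior: .single)
        return .ok(.init(body: .application_jsonl(body)))
    }
}

// Bootstrap Vapor and register the generated routes on its OpenAPI transport.
let app = try await Vapor.Application.make()
let transport = VaporTransport(routesBuilder: app)
try ChatProxy().registerHandlers(on: transport)
try await app.execute()
```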
Speakers
Si Beaumont
Honza Dvorsky