add support for OpenAI models #35
Conversation
Pull Request Overview
This PR adds support for OpenAI models within the Swift Bedrock Library, enabling both text completion and conversation capabilities with OpenAI's GPT models through AWS Bedrock.
- Implements OpenAI model support with request/response handling for GPT OSS 20b and 120b variants
- Enhances TextCompletion with reasoning extraction to parse <reasoning> tags from model outputs (a rough sketch of the idea follows below)
- Adds comprehensive test coverage and example applications demonstrating both invoke and converse patterns
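As a rough illustration of the reasoning-extraction behavior, here is a minimal sketch, assuming the model wraps its chain of thought in <reasoning>...</reasoning> tags; the extractReasoning helper and its return shape are illustrative, not the library's actual implementation.

```swift
import Foundation

// Illustrative helper, not the Swift Bedrock Library's implementation:
// splits a raw completion into the text inside <reasoning>...</reasoning>
// and the remaining user-visible answer.
func extractReasoning(from completion: String) -> (reasoning: String?, answer: String) {
    guard
        let start = completion.range(of: "<reasoning>"),
        let end = completion.range(of: "</reasoning>"),
        start.upperBound <= end.lowerBound
    else {
        // No complete reasoning block: return the completion unchanged.
        return (nil, completion)
    }
    let reasoning = String(completion[start.upperBound..<end.lowerBound])
        .trimmingCharacters(in: .whitespacesAndNewlines)
    // Everything outside the <reasoning> block is the visible answer.
    let answer = (String(completion[..<start.lowerBound]) + String(completion[end.upperBound...]))
        .trimmingCharacters(in: .whitespacesAndNewlines)
    return (reasoning, answer)
}

// Example:
// extractReasoning(from: "<reasoning>Check both tags first.</reasoning>Final answer.")
// returns (reasoning: "Check both tags first.", answer: "Final answer.")
```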
Reviewed Changes
Copilot reviewed 15 out of 15 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| Sources/Models/OpenAI/*.swift | Core OpenAI model implementation with request/response structures and parameter definitions |
| Sources/InvokeModel/TextCompletion.swift | Enhanced with reasoning extraction capability and Sendable conformance |
| Sources/InvokeModel/InvokeModelResponse.swift | Added logging parameter for response creation methods |
| Sources/BedrockModel.swift | Registered new OpenAI model variants in model initialization |
| Tests/InvokeModel/TextCompletionTests.swift | Comprehensive test coverage for reasoning extraction functionality |
| Examples/openai/* | Complete example applications demonstrating OpenAI model usage |
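To make the example entry above concrete, here is a minimal sketch of what an invoke-style text completion against one of the new OpenAI models could look like. The module, type, method, model identifier, and region names used below (BedrockService, completeText, .openai_gpt_oss_20b, .uswest2) are assumptions for illustration only and may not match the actual code under Examples/openai/*.

```swift
// Hypothetical sketch only: the service initializer, completeText method, and
// model identifier below are assumed names, not confirmed library API.
import BedrockService

@main
struct OpenAITextCompletionExample {
    static func main() async throws {
        // Assumed initializer taking an AWS region.
        let bedrock = try await BedrockService(region: .uswest2)

        // Assumed text-completion call against one of the new OpenAI models.
        let completion = try await bedrock.completeText(
            "Summarize what Amazon Bedrock is in one sentence.",
            with: .openai_gpt_oss_20b
        )

        // After this PR, the returned TextCompletion is expected to carry both the
        // generated text and any reasoning extracted from <reasoning> tags.
        print(completion)
    }
}
```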
        self.top_p = topP
    }

    private struct OpenAIMessage: Codable {
Copilot AI commented on Aug 8, 2025:
The Role type is referenced but not defined in this file. This creates an incomplete API definition that could cause compilation errors.
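If Role really is undefined in this file, one way to close the gap is a small enum next to OpenAIMessage. This is a hedged sketch (the case names mirror the standard OpenAI chat roles), not the PR's actual definition; the fix may instead reuse a Role type declared elsewhere in the library.

```swift
// Hypothetical definition for illustration; the real fix may simply reuse an
// existing Role type from another file in the library.
private enum Role: String, Codable {
    case system
    case user
    case assistant
}

private struct OpenAIMessage: Codable {
    let role: Role
    let content: String
}
```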
    if topP != nil && temperature != nil {
        throw BedrockLibraryError.notSupported("Alter either topP or temperature, but not both.")
    }
Copilot AI commented on Aug 8, 2025:
The validation logic prevents using both topP and temperature simultaneously, but this restriction may not align with OpenAI's actual API capabilities. Consider verifying this constraint against OpenAI's documentation.
Suggested change: replace

    if topP != nil && temperature != nil {
        throw BedrockLibraryError.notSupported("Alter either topP or temperature, but not both.")
    }

with

    // Both topP and temperature can be set simultaneously per OpenAI API documentation.
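For context, OpenAI's API reference recommends altering either top_p or temperature rather than both, but it does not reject requests that set both, so relaxing the hard error to a documented recommendation (or a logged warning) appears consistent with the upstream behavior.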
    return OpenAIRequestBody(
        prompt: prompt,
        maxTokens: maxTokens,
        temperature: temperature ?? parameters.temperature.defaultValue,
        topP: topP ?? parameters.topP.defaultValue,
Copilot AI commented on Aug 8, 2025:
If both topP and temperature are nil, the code will use the default temperature value even when topP should be used instead. This could lead to unexpected parameter combinations.
Suggested change: replace

    return OpenAIRequestBody(
        prompt: prompt,
        maxTokens: maxTokens,
        temperature: temperature ?? parameters.temperature.defaultValue,
        topP: topP ?? parameters.topP.defaultValue,

with

    // If both are nil, prefer topP and set temperature to nil
    let resolvedTemperature: Double?
    let resolvedTopP: Double?
    if let t = temperature {
        resolvedTemperature = t
        resolvedTopP = nil
    } else if let p = topP {
        resolvedTemperature = nil
        resolvedTopP = p
    } else {
        // Both are nil: prefer topP default, temperature nil
        resolvedTemperature = nil
        resolvedTopP = parameters.topP.defaultValue
    }
    return OpenAIRequestBody(
        prompt: prompt,
        maxTokens: maxTokens,
        temperature: resolvedTemperature,
        topP: resolvedTopP,
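To pin down the precedence the suggestion above encodes, here is a small self-contained sketch of the either/or resolution with a few quick checks; the helper name and tuple shape are illustrative, not code from the PR.

```swift
// Illustrative helper: an explicit temperature wins, then an explicit topP,
// and only when both are nil does the topP default apply. Not the PR's code.
func resolveSamplingParameters(
    temperature: Double?,
    topP: Double?,
    defaultTopP: Double
) -> (temperature: Double?, topP: Double?) {
    if let temperature { return (temperature, nil) }
    if let topP { return (nil, topP) }
    return (nil, defaultTopP)
}

// Quick sanity checks of the three cases discussed above.
let explicitTemperature = resolveSamplingParameters(temperature: 0.7, topP: nil, defaultTopP: 1.0)
assert(explicitTemperature.temperature == 0.7 && explicitTemperature.topP == nil)

let explicitTopP = resolveSamplingParameters(temperature: nil, topP: 0.9, defaultTopP: 1.0)
assert(explicitTopP.temperature == nil && explicitTopP.topP == 0.9)

let bothNil = resolveSamplingParameters(temperature: nil, topP: nil, defaultTopP: 1.0)
assert(bothNil.temperature == nil && bothNil.topP == 1.0)
```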
Co-authored-by: Copilot <[email protected]>
Add support for OpenAI Models + a converse and a text completion example