Right now, when creating a new ChatGPT::Client, the model/LLM version being used seems fixed or implicit. For some use cases, it would be useful to explicitly specify which version of the model (e.g. gpt-4o-mini, gpt-4.1, gpt-3.5-turbo, etc.) the client should talk to.
Maybe something like:

```ruby
client = ChatGPT::Client.new(
  api_key: ENV["OPENAI_API_KEY"],
  model: "gpt-4o-mini"
)
```
The use case: I'd like to use different models for different prompts, e.g. a cheaper model for simple requests, to save on cost.
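To make the idea concrete, here's a rough sketch of what the initializer could look like internally. This is just an illustration, not the gem's actual code — the `DEFAULT_MODEL` constant, `request_payload` method, and the assumed default of `"gpt-3.5-turbo"` are all hypothetical:

```ruby
# Hypothetical sketch of a model-aware client (not the gem's real implementation).
module ChatGPT
  class Client
    # Assumed fallback when no model is given.
    DEFAULT_MODEL = "gpt-3.5-turbo"

    attr_reader :model

    def initialize(api_key:, model: DEFAULT_MODEL)
      @api_key = api_key
      @model = model
    end

    # Each request would then carry the configured model in its payload.
    def request_payload(messages)
      { model: @model, messages: messages }
    end
  end
end

client = ChatGPT::Client.new(api_key: ENV["OPENAI_API_KEY"], model: "gpt-4o-mini")
client.model # => "gpt-4o-mini"
```

That way existing code keeps working with the default, and callers who care about cost can opt into a specific model per client instance.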
Thanks, awesome gem!!