
Katwinkl3/support other providers#24

Merged
kelsey-wong merged 9 commits into main from katwinkl3/support_other_providers
Jan 2, 2026

Conversation

@katwinkl3
Contributor

Key Info

  • Implementation plan: link
  • Priority: normal

What changed?

What do you want the reviewer(s) to focus on?

  • For the use_logprobs change, I'm not sure whether the MODELS_WITH_LOGPROBS list is comprehensive, or whether this logic should be added anywhere else in the pipeline.
  • Also, I noticed that there are two ways to include the model param: (1) in completion_params (from openai_args), which is used in ReferenceCompletionTemplate, and (2) in ConfigInput (which seems like it should be used everywhere else). If a model is only added for one, the other pipeline process uses the default GPT model. Is this the expected behavior?
  • I also tried using different models for (1) and (2) (both Gemini), but the default GPT model was still used somewhere in the pipeline; I haven't looked into why yet.
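The use_logprobs concern above could be gated with a simple capability check. This is a hypothetical sketch, not the PR's actual code: the membership of MODELS_WITH_LOGPROBS shown here and the helper names (supports_logprobs, build_completion_params) are illustrative assumptions.

```python
# Hypothetical sketch of gating logprobs requests on model capability.
# The set contents below are examples only, not a comprehensive list.
MODELS_WITH_LOGPROBS = {
    "gpt-4o",
    "gpt-4o-mini",
    "gpt-3.5-turbo",
}


def supports_logprobs(model: str) -> bool:
    """Return True if the model is known to expose token logprobs."""
    return model in MODELS_WITH_LOGPROBS


def build_completion_params(model: str, use_logprobs: bool) -> dict:
    """Only request logprobs when the target model supports them,
    so unsupported providers don't receive an invalid parameter."""
    params = {"model": model}
    if use_logprobs and supports_logprobs(model):
        params["logprobs"] = True
    return params
```

With a check like this, a Gemini model would silently skip the logprobs parameter instead of failing, which is one way to answer "should this logic be added anywhere else in the pipeline": centralize it where completion params are built.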

Checklist

  • Did you link the GitHub issue?
  • Did you follow deployment steps or bump the version if needed?
  • Did you add/update tests?
  • What QA did you do?
    • Tested in my own Jupyter notebook

@kelsey-wong
Collaborator

Also, I noticed that there are 2 ways to include the model param, 1 in completion_params (from openai_args) which is used in ReferenceCompletionTemplate and 2 in ConfigInput (which seems like it should be used everywhere else). If a model is only added for 1, the other pipeline process will use the default gpt model. Is this the expected behavior?

Passing the model through ConfigInput is leftover from before I refactored to the OpenAI params spec for the inference function, so I'll remove that. The model should only be set through the params, not the config input.
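The fix described above amounts to making completion_params the single source of truth for the model. A minimal sketch, assuming a resolve_model helper and a DEFAULT_MODEL constant that are illustrative, not the repository's actual names:

```python
# Illustrative sketch: the model is resolved solely from the
# openai-style completion params; ConfigInput no longer carries one.
DEFAULT_MODEL = "gpt-4o-mini"  # hypothetical fallback default


def resolve_model(completion_params: dict) -> str:
    """Read the model from completion_params, falling back to the
    default. Every pipeline stage resolving the model this way avoids
    the mismatch where one path reads ConfigInput and another reads
    the params."""
    return completion_params.get("model", DEFAULT_MODEL)
```

This also explains the third review question: if any stage still resolved the model from a second location, that stage would fall back to the default GPT model even when both inputs were set to Gemini.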

@kelsey-wong kelsey-wong merged commit 54a545e into main Jan 2, 2026
5 of 9 checks passed
@kelsey-wong kelsey-wong deleted the katwinkl3/support_other_providers branch January 2, 2026 20:52