Hey, I am trying to use a reasoning model with ruby_llm. For instance: `chat = RubyLLM.chat(model: 'o4-mini-deep-research')`. But I get the error: Does ruby_llm work with reasoning models? Thanks :)
Answered by crmne, Sep 8, 2025
It works with reasoning models, but not with OpenAI's Responses API yet. We already have a PR open about that!
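For reference, a minimal sketch of using a reasoning model through ruby_llm's standard chat interface (the path that works today, per the answer above). The model name `o3-mini` and the `OPENAI_API_KEY` environment variable are illustrative assumptions, not taken from this thread; models that require OpenAI's Responses API (such as the deep-research models) will not work until that PR lands.

```ruby
require "ruby_llm"

# Assumption: your OpenAI key is available as OPENAI_API_KEY.
RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
end

# "o3-mini" is a hypothetical stand-in for a reasoning model served
# via the Chat Completions path, which ruby_llm supports.
chat = RubyLLM.chat(model: "o3-mini")

response = chat.ask("In one sentence, why is the sky blue?")
puts response.content
```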
Answer selected by CPloscaru