[Bug]: Think tags still displayed, not properly rendered. #8409
Unanswered
oneshot0011 asked this question in Troubleshooting
Replies: 0 comments
What happened?
#5678 (comment)
I encountered a very similar issue to the one linked above: an NVIDIA NIM thinking model's thinking block is not rendered properly, and the think tags are shown as literal text instead. Pic 1 shows the incorrect output.
Version Information
See the linked comment above.
Steps to Reproduce
Here is the code from the official NVIDIA NIM page.
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key="$API_KEY_REQUIRED_IF_EXECUTING_OUTSIDE_NGC"
)

completion = client.chat.completions.create(
    model="nvidia/llama-3.1-nemotron-ultra-253b-v1",
    messages=[{"role": "system", "content": "detailed thinking on"}],
    temperature=0.6,
    top_p=0.95,
    max_tokens=4096,
    frequency_penalty=0,
    presence_penalty=0,
    stream=True
)

for chunk in completion:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct