Conversation

@iSevenDays (Contributor) commented Jul 26, 2025

  • I have read the contributing guidelines
  • Self-reported review complexity:
    • Low
    • Medium
    • High

The recent function call implementation changed streaming responses to always send empty content with diffs, which broke text completion streaming endpoints (like those used by mikupad) that need actual token content in each streaming chunk. This fix differentiates between OpenAI-compatible chat completion (which uses diffs) and text completion endpoints (which need actual content) using the existing slot.oaicompat flag.
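
For context, here is a minimal C++ sketch of the idea behind the fix. It is not the repository's actual server code: only the slot.oaicompat flag comes from the description above, and the struct and field names here (server_slot, stream_chunk, token_text, diff) are illustrative assumptions. The point is that only OpenAI-compatible chat-completion slots switch to diff-only payloads, while plain text-completion slots keep receiving the actual token content in every chunk.

```cpp
// Illustrative sketch only -- not the real server source.
// It shows the branch on slot.oaicompat when a streaming chunk is built.
#include <string>

struct server_slot {
    // True when the request arrived via the OpenAI-compatible chat endpoint.
    bool oaicompat = false;
};

struct stream_chunk {
    std::string content;         // actual token text, needed by /completion clients (e.g. mikupad)
    std::string oaicompat_diff;  // incremental diff, used by OpenAI-compatible chat clients
};

// Build one streaming chunk for the client attached to this slot.
static stream_chunk make_stream_chunk(const server_slot & slot,
                                      const std::string & token_text,
                                      const std::string & diff) {
    stream_chunk chunk;
    if (slot.oaicompat) {
        // Chat-completion clients reconstruct the message from diffs,
        // so the plain content field may stay empty here.
        chunk.oaicompat_diff = diff;
    } else {
        // Text-completion clients expect the generated token content in every
        // chunk; always sending empty content is what broke /completion streaming.
        chunk.content = token_text;
    }
    return chunk;
}
```

Keying the decision on the existing per-slot flag means no new request parameters are needed; /completion clients such as mikupad get their token text back without any change on their side.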

@iSevenDays (Contributor, Author) commented:

The fix has also been verified by another person here: #628 (comment)

@saood06 (Collaborator) commented Jul 27, 2025

> The fix has also been verified by another person here: #628 (comment)

And by me.

@saood06 (Collaborator) left a review:

Tested.

This restored functionality to the /completion endpoint.

@saood06 (Collaborator) commented Jul 27, 2025

@ikawrakow

I've been very intentional about not pushing code into branches that are not mine (including main) without your approval, as this is your repo, but I am making an exception in this case: this is a very minor change that fixes a rather serious bug, and you are out on vacation.

@saood06 merged commit d65c8ce into ikawrakow:main on Jul 27, 2025.