Add support for --carry-initial-prompt #3395
base: master
Conversation
The patches for the Ruby bindings are nice, though I'm not sure whether the essential changes and API will be accepted. Let me point out just one thing.
Changes applied - let me know what you think of this PR.

@ggerganov could I ask you to review this PR?
Any update on this? Happy to resolve the conflict in
Hi, apologies for the long wait. I'm interested in adding this functionality, but I'm having difficulty following the implemented logic for prepending the initial prompt. I'd like to see this simplified in some way. I'll try to add some suggestions on how to improve it.
I think the main complexity comes from using a single `prompt_past` vector in the `whisper_state`, which results in some convoluted logic for deduplicating and slicing the tokens. I expect that the logic can become much simpler if you replace `prompt_past` with 2 vectors: `prompt_past0` and `prompt_past1`. The full prompt is a concatenation of `prompt_past0 + prompt_past1`. The `prompt_past0` can be utilized to store some static prefix - i.e. the original prompt that is being carried.
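For illustration, here is a minimal sketch of the two-vector idea. The names `prompt_past0` and `prompt_past1` come from the suggestion above; the `prompt_state` struct and `build_prompt` helper are hypothetical, not the actual whisper.cpp code:

```cpp
#include <vector>
#include <cstdint>

using whisper_token = int32_t;

// Hypothetical state layout: the single prompt_past vector is split in two.
struct prompt_state {
    std::vector<whisper_token> prompt_past0; // static prefix, e.g. the carried initial prompt
    std::vector<whisper_token> prompt_past1; // rolling context from previously decoded windows
};

// The full prompt for the next window is simply prompt_past0 + prompt_past1 -
// no deduplication or slicing of a single mixed buffer is needed.
static std::vector<whisper_token> build_prompt(const prompt_state & st) {
    std::vector<whisper_token> prompt;
    prompt.reserve(st.prompt_past0.size() + st.prompt_past1.size());
    prompt.insert(prompt.end(), st.prompt_past0.begin(), st.prompt_past0.end());
    prompt.insert(prompt.end(), st.prompt_past1.begin(), st.prompt_past1.end());
    return prompt;
}

int main() {
    prompt_state st;
    st.prompt_past0 = {1, 2, 3}; // e.g. tokens of the carried initial prompt
    st.prompt_past1 = {42, 43};  // e.g. tokens decoded in the previous window
    const auto prompt = build_prompt(st); // {1, 2, 3, 42, 43}
    return prompt.size() == 5 ? 0 : 1;
}
```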
That's a good point. I tried taking it a bit further and simplifying it as much as I could - what do you think?
Pushed PR fixes - let me know what you think.
I've included your suggestion and made two further simplifications.
I did another check of all the logic in here and I think I found another issue - we only want to run line 7569 outside the
```cpp
// update prompt_past1
prompt_past1.clear();
if (!params.carry_initial_prompt && !prompt.empty() && prompt.front() == whisper_token_prev(ctx)) {
    prompt_past1.insert(prompt_past1.end(), prompt.begin() + 1, prompt.end() - prompt_init.size());
}
```
An alternative would be to preserve the original behaviour by using the fact that `prompt.size() > prompt_past0.size()`:
```diff
 // update prompt_past1
 prompt_past1.clear();
-if (!params.carry_initial_prompt && !prompt.empty() && prompt.front() == whisper_token_prev(ctx)) {
-    prompt_past1.insert(prompt_past1.end(), prompt.begin() + 1, prompt.end() - prompt_init.size());
+if (!prompt.empty() && prompt.front() == whisper_token_prev(ctx)) {
+    prompt_past1.insert(prompt_past1.end(), prompt.begin() + 1 + prompt_past0.size(), prompt.end() - prompt_init.size());
 }
```
Though make sure to double-check this works as expected.
I think that this assumes the entire `prompt_past0` is always included in the prompt, but that's not guaranteed. For example, if `max_ctx_half - 1 < prompt_past0.size()`, we only take a tail of `prompt_past0`, not all of it.
Yes, good point. Should we truncate `prompt_past0` upon initialization so that it does not exceed the `max_ctx_half`?
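A minimal sketch of that suggestion (hypothetical code, not the code in this PR; it assumes `prompt_past0` holds the carried initial prompt tokens and `max_ctx_half` is half the model text context):

```cpp
// Clamp the carried initial prompt once, at initialization, so the later
// per-window logic can assume prompt_past0.size() <= max_ctx_half.
if ((int) prompt_past0.size() > max_ctx_half) {
    // drop the leftmost (oldest) tokens, keeping only the tail that fits
    prompt_past0.erase(prompt_past0.begin(), prompt_past0.end() - max_ctx_half);
}
```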
Co-authored-by: Georgi Gerganov <[email protected]>
This PR brings over the `--carry-initial-prompt` flag from the Python library (openai/whisper#2343).

By default, a `--prompt` (initial prompt) is only used for the first decoding window; subsequent windows rely on the text generated so far for continuity. When you pass `--carry-initial-prompt`, the initial prompt tokens are explicitly prepended to every internal decode window. This mirrors the Python reference implementation's `carry_initial_prompt` behavior and can help enforce custom vocabulary or style throughout long transcriptions. Trade-off: it may slightly reduce the model's ability to adapt dynamically to newly generated context (it can increase the risk of repetitions if the prompt is long). If the combined size of the carried initial prompt and the rolling context exceeds half the model text context, the leftmost (oldest) part of the initial prompt is truncated to fit.
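For context, a hedged sketch of using the option from the C API, assuming the PR exposes it as a `carry_initial_prompt` field on `whisper_full_params` (matching the `params.carry_initial_prompt` check in the diff above); the model path, prompt text, and audio loading are placeholders:

```cpp
#include "whisper.h"

#include <vector>

int main() {
    struct whisper_context * ctx = whisper_init_from_file_with_params(
        "models/ggml-base.en.bin", whisper_context_default_params());
    if (!ctx) {
        return 1;
    }

    struct whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);
    params.initial_prompt       = "Glossary: whisper.cpp, GGML, quantization"; // custom vocabulary/style
    params.carry_initial_prompt = true; // assumed field added by this PR: prepend the prompt to every window

    std::vector<float> pcmf32; // 16 kHz mono float samples, loaded elsewhere
    if (whisper_full(ctx, params, pcmf32.data(), (int) pcmf32.size()) != 0) {
        whisper_free(ctx);
        return 1;
    }

    whisper_free(ctx);
    return 0;
}
```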