
Commit 369ea80

Merge pull request #926 from Vinkle-hzt/main
fix bistream extra token
2 parents: 276cfa0 + 69518b2

File tree

1 file changed: +4 -1 lines changed


cosyvoice/llm/llm.py

Lines changed: 4 additions & 1 deletion
@@ -382,7 +382,10 @@ def inference_bistream(
             if text_cache.size(1) >= self.mix_ratio[0]:
                 lm_input_text = text_cache[:, :self.mix_ratio[0]]
                 logging.info('append {} text token'.format(lm_input_text.size(1)))
-                lm_input = torch.concat([lm_input, lm_input_text], dim=1)
+                if len(out_tokens) != 0 and out_tokens[-1] == self.speech_token_size + 2:
+                    lm_input = lm_input_text
+                else:
+                    lm_input = torch.concat([lm_input, lm_input_text], dim=1)
                 text_cache = text_cache[:, self.mix_ratio[0]:]
             else:
                 logging.info('not enough text token to decode, wait for more')
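
For context, a minimal standalone sketch of the corrected branching, under the assumption (consistent with the condition in the diff) that speech_token_size + 2 is the fill token the LM emits when it has consumed the current input and needs more text. The helper name next_lm_input, the speech_token_size value, and the dummy tensor shapes are illustrative, not part of the repository's API: when the last generated token is the fill token, the new text chunk replaces lm_input instead of being concatenated onto it, which avoids feeding one extra stale token back into the LM.

import torch

# Assumed value for illustration only; the real value comes from the model config.
speech_token_size = 6561
fill_token = speech_token_size + 2  # assumed: emitted when the LM wants more text

def next_lm_input(lm_input, lm_input_text, out_tokens):
    # After a fill token, the previous lm_input has already been consumed,
    # so appending it again would inject an extra token; start from the new text.
    if len(out_tokens) != 0 and out_tokens[-1] == fill_token:
        return lm_input_text
    # Otherwise keep the running input and append the new text embeddings.
    return torch.concat([lm_input, lm_input_text], dim=1)

# Usage with dummy embeddings (batch=1, seq_len, hidden=4):
lm_input = torch.zeros(1, 3, 4)
lm_input_text = torch.ones(1, 5, 4)
print(next_lm_input(lm_input, lm_input_text, [fill_token]).shape)  # torch.Size([1, 5, 4])
print(next_lm_input(lm_input, lm_input_text, [123]).shape)         # torch.Size([1, 8, 4])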
