Getting "final prompt starts with 2 BOS tokens" warning with DeepSeek V3.1 #7500

@Lissanro

Description

App Version

3.26.2

API Provider

OpenAI Compatible

Model Used

DeepSeek V3.1

Roo Code Task Links (Optional)

No response

๐Ÿ” Steps to Reproduce

Load DeepSeek V3.1 with --jinja enabled in ik_llama.cpp. I get this warning on every prompt:

check_double_bos_eos: Added a BOS token to the prompt as specified by the model but the prompt also starts with a BOS token. So now the final prompt starts with 2 BOS tokens. Are you sure this is what you want?

It looks like Roo is adding a BOS token to the prompt?

The article at https://docs.unsloth.ai/basics/deepseek-v3.1-how-to-run-locally mentions: "A BOS is forcibly added ... For llama.cpp / GGUF inference, you should skip the BOS since it'll auto add it."
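The fix suggested by that guidance can be sketched as a client-side step: strip a leading BOS marker from the already-rendered prompt before sending it, since the llama.cpp-style server prepends BOS itself during tokenization. This is a minimal illustration, not Roo's actual code; the `strip_leading_bos` helper and the BOS token string are assumptions (verify the exact BOS string against the model's tokenizer metadata).

```python
# Hypothetical BOS marker; DeepSeek models use a "begin of sentence" special
# token, but the exact string should be read from the model's tokenizer config.
DEEPSEEK_BOS = "<|begin_of_sentence|>"


def strip_leading_bos(prompt: str, bos: str = DEEPSEEK_BOS) -> str:
    """Remove a single leading BOS marker from a rendered prompt.

    llama.cpp-style servers add BOS automatically when tokenizing, so a
    prompt that already starts with BOS would end up with two of them,
    triggering the check_double_bos_eos warning.
    """
    if prompt.startswith(bos):
        return prompt[len(bos):]
    return prompt
```

With this applied, the server's tokenizer is the only place BOS is added, so the warning should no longer fire.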

💥 Outcome Summary

There should not be a duplicate BOS token, since it can potentially reduce the quality of the model's output.

📄 Relevant Logs or Errors (Optional)

Metadata

Labels

Issue - In Progress (Someone is actively working on this. Should link to a PR soon.), bug (Something isn't working)

    Projects

    Status

    Done