Add reverse prompting example #17

@davidrpugh

Description

llama-cli supports reverse prompting. Reverse prompts are a powerful way to create a chat-like experience with your model: using the --reverse-prompt option, text generation pauses whenever a specified string is encountered.

  • -r PROMPT, --reverse-prompt PROMPT: Specify one or multiple reverse prompts to pause text generation and switch to interactive mode. For example, -r "User:" can be used to jump back into the conversation whenever it's the user's turn to speak. This helps create a more interactive and conversational experience.

The --in-prefix flag adds a prefix to your input; primarily, it is used to insert a space after the reverse prompt. Here's an example of how to use the --in-prefix flag in conjunction with the --reverse-prompt flag:

llama-cli -r "User:" --in-prefix " "

The --in-suffix flag adds a suffix after your input. This is useful for appending an "Assistant:" prompt after the user's input. It is inserted after the newline character (\n) that is automatically appended to the end of the user's input. Here's an example of how to use the --in-suffix flag in conjunction with the --reverse-prompt flag:

llama-cli -r "User:" --in-prefix " " --in-suffix "Assistant:"

When the --in-prefix or --in-suffix option is used, the chat template (--chat-template) is disabled.
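The pause behavior described above can be illustrated with a short sketch. This is illustrative only and not llama.cpp's actual implementation: real reverse-prompt matching happens on the token stream inside llama-cli, but the idea is the same — stop generating as soon as the output ends with a reverse prompt, so the user can take their turn.

```python
# Illustrative sketch of reverse-prompt handling (NOT llama.cpp's
# actual implementation; names and logic are hypothetical).

def generate_with_reverse_prompts(token_stream, reverse_prompts):
    """Accumulate generated text, pausing when the output ends with
    any of the given reverse prompts."""
    output = ""
    for token in token_stream:
        output += token
        # Pause generation if the text now ends with a reverse prompt,
        # i.e. the model has started to write the user's turn.
        if any(output.endswith(rp) for rp in reverse_prompts):
            break
    return output

# Example: generation stops as soon as "User:" is produced.
tokens = ["Assistant:", " Sure", ", here you go.", "\n", "User:"]
print(generate_with_reverse_prompts(tokens, ["User:"]))
```

At this point an interactive frontend would collect the user's next input, prepend the --in-prefix text, append the --in-suffix text, and resume generation.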

Need a good example use case for this functionality.


    Labels

    enhancement (New feature or request)
