Enhancement: Add one time system prompt setting, different to prompt_prefix #5279
Replies: 3 comments
-
Thanks for using LibreChat! It is customary for system instructions to be included with the accumulating chat history on every run; otherwise, the instructions will not be "remembered" on subsequent messages. You pointed to the system prompts Anthropic uses for Claude.ai: I'm fairly certain these are included in the context window every time, because otherwise they would be forgotten and could not ensure the behavior the LLM is meant to exhibit throughout the conversation.

LLMs are inherently stateless: each inference is independent and maintains no memory of previous interactions. Because of this, everything that needs to influence the model's response, including system instructions and chat history, must be present in the context window during each inference. This is why system instructions need to be included with every prompt, typically prepended through mechanisms like promptPrefix.
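To make the stateless point concrete, here is a minimal sketch of what the request payload looks like on each turn. `build_payload` is an illustrative helper, not LibreChat code; the shape of the message list follows the common OpenAI-style chat-completion format:

```python
# Minimal sketch: because the model keeps no state between inferences,
# every request must carry the system prompt plus the full chat history.
# `build_payload` and SYSTEM_PROMPT are illustrative names, not LibreChat internals.

SYSTEM_PROMPT = "You are a concise, helpful assistant."

def build_payload(history, user_message):
    """Assemble the message list sent on a single inference.

    The system message is prepended every time (this is what a
    promptPrefix-style mechanism does under the hood).
    """
    return (
        [{"role": "system", "content": SYSTEM_PROMPT}]  # re-sent on every turn
        + history                                       # accumulated conversation
        + [{"role": "user", "content": user_message}]   # the new user turn
    )

# First turn: no history yet, but the system message is still there.
payload = build_payload([], "Hello!")
assert payload[0]["role"] == "system"
```

If the system message were sent only once and then dropped from later payloads, nothing in subsequent requests would carry its instructions, so the model could not follow them.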
-
For one-time messages added at any point, you can make use of "prompts" proper:
-
Appreciate the fast and detailed answer! That makes sense. I knew about the stateless nature of LLMs; I just thought that, since the API already sends a history of messages, one could save tokens and avoid repetition by adding a system message at the beginning of the conversation. But I guess once the context length is reached, that prompt would then be forgotten.
-
What features would you like to see added?
First of all thanks for developing LibreChat :)
Similar to what Anthropic does, I would like to be able to start an interaction with an LLM using a one-time system prompt.
More details
This is different from prompt_prefix, which is added to every prompt. That is not ideal for a system prompt like the one linked above.
We could add a system_prompt setting to model_specs that starts the chat with a message containing this prompt.
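A rough sketch of how such a setting might look in librechat.yaml. Note that `system_prompt` is only the proposed key, not an existing LibreChat option, and the spec name and model are placeholders:

```yaml
# Hypothetical sketch — `system_prompt` is the *proposed* setting,
# not an existing LibreChat configuration option.
modelSpecs:
  list:
    - name: "assistant-with-system-prompt"
      label: "Assistant (one-time system prompt)"
      preset:
        endpoint: "anthropic"
        # Proposed: sent once as the conversation's opening message,
        # unlike promptPrefix, which is prepended on every run.
        system_prompt: "You are a helpful assistant. ..."
```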
Which components are impacted by your request?
Endpoints
Pictures
No response
Code of Conduct