src/content/docs/workers-ai/features/prompting.mdx (12 additions, 2 deletions)
@@ -56,8 +56,18 @@ Typically, the role can be one of three options:

  use them to set rules and how you expect the AI to behave.
- <strong>user</strong> - User messages are where you actually query the AI by
  providing a question or a conversation.
- <strong>assistant</strong> - Assistant messages can be used to guide the AI toward a desired output format
  (for example, enforcing JSON structure or Markdown formatting).

  ⚠️ **Model support:** Not all Workers AI text-generation models currently support the `assistant` role.
  Models based on chat-style templates (such as LLaMA 2 Chat or Mistral-Instruct) are more likely to respect it,
  while some base instruct models may ignore these messages.

  If you include `assistant` in your prompt with a model that doesn’t support it, the message will simply be ignored.

  ✅ **Recommendation:** For best results, use **scoped prompts**, which provide a unified interface across models.
  This way, your application logic doesn’t depend on which roles each model implements.

OpenAI has a [good explanation](https://platform.openai.com/docs/guides/text-generation#messages-and-roles) of how they use these roles with their GPT models. Even though chat templates are flexible, other text generation models tend to follow the same conventions.
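To make the role convention concrete, here is a minimal sketch of building a messages array with the three roles. The `buildMessages` helper and its `supportsAssistant` flag are hypothetical illustrations (not part of the Workers AI API): they show how application code might attach the optional `assistant` formatting hint only for models expected to respect it, since unsupported models ignore the message anyway.

```typescript
// Roles follow the common chat-template convention described above.
type Role = "system" | "user" | "assistant";

interface Message {
  role: Role;
  content: string;
}

// Hypothetical helper: assemble a prompt as a messages array.
// `supportsAssistant` is an illustrative flag for whether the target
// model's chat template respects `assistant` formatting hints.
function buildMessages(
  question: string,
  opts: { supportsAssistant: boolean },
): Message[] {
  const messages: Message[] = [
    // system: set rules and how you expect the AI to behave
    { role: "system", content: "You are a concise assistant. Answer in one sentence." },
    // user: the actual query
    { role: "user", content: question },
  ];
  if (opts.supportsAssistant) {
    // assistant: hint at the desired output format (e.g. JSON)
    messages.push({ role: "assistant", content: "Respond as a JSON object." });
  }
  return messages;
}

// Chat-template models get the extra hint; others just get system + user.
const withHint = buildMessages("What is an edge network?", { supportsAssistant: true });
const withoutHint = buildMessages("What is an edge network?", { supportsAssistant: false });
console.log(withHint.length, withoutHint.length); // 3 2
```

Centralizing this decision in one helper is one way to keep application logic independent of which roles each model implements, in the spirit of the scoped-prompt recommendation above.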