diff --git a/src/content/docs/workers-ai/guides/prompting.mdx b/src/content/docs/workers-ai/guides/prompting.mdx
index 09c51087c7228ee..4656d9d3c28caaa 100644
--- a/src/content/docs/workers-ai/guides/prompting.mdx
+++ b/src/content/docs/workers-ai/guides/prompting.mdx
@@ -57,7 +57,7 @@ Typically, the role can be one of three options:
 
 - assistant - Assistant messages hint to the AI about the desired output format. Not all models support this role.
 
-OpenAI has a [good explanation](https://docs.airops.com/docs/llm-step#openai-chat-model-specifications) of how they use these roles with their GPT models. Even though chat templates are flexible, other text generation models tend to follow the same conventions.
+OpenAI has a [good explanation](/workers-ai/models/llama-2-13b-chat-awq) of how they use these roles with their GPT models. Even though chat templates are flexible, other text generation models tend to follow the same conventions.
 
 Here's an input example of a scoped prompt using system and user roles:
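
For context, the hunk ends just before the docs page's own example. A minimal sketch of what such a scoped prompt input might look like is shown below, assuming the Workers AI chat-style `messages` format that the surrounding section describes; the role values mirror the system/user/assistant options listed in the hunk, and the content strings and variable name are illustrative, not taken from the patched file.

```ts
// Minimal sketch of a scoped prompt input (illustrative; not from prompting.mdx itself).
// Assumes the Workers AI chat-style `messages` format described in the edited section.
const input = {
	messages: [
		// system: sets the assistant's overall behavior and constraints
		{ role: "system", content: "You are a friendly assistant that answers concisely." },
		// user: the actual question or instruction from the end user
		{ role: "user", content: "What is the origin of the phrase 'Hello, World'?" },
	],
};
```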