articles/ai-services/openai/faq.yml
16 additions & 2 deletions
@@ -7,7 +7,7 @@ metadata:
   manager: nitinme
   ms.service: azure-ai-openai
   ms.topic: faq
-  ms.date: 11/15/2023
+  ms.date: 11/22/2023
   ms.author: mbullwin
   author: mrbullwinkle
   title: Azure OpenAI Service frequently asked questions
@@ -25,7 +25,7 @@ sections:
   - question: |
      Does Azure OpenAI work with the latest Python library released by OpenAI (version>=1.0)?
    answer: |
-      Azure OpenAI is supported by the latest release of the [OpenAI Python library (version>=1.0)](https://pypi.org/project/openai/). However, it is important to note migration of your codebase using `openai migrate` is not supported and will not work with code that targets Azure OpenAI.
+      Azure OpenAI is supported by the latest release of the [OpenAI Python library (version>=1.0)](https://pypi.org/project/openai/). However, it's important to note that migrating your codebase with `openai migrate` is not supported and will not work with code that targets Azure OpenAI.
   - question: |
      I can't find GPT-4 Turbo Preview?
    answer:
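
For readers following this change, here is a minimal sketch of calling Azure OpenAI through the v1+ OpenAI Python library (rather than trying to run `openai migrate` against Azure-targeted code). The environment variable names, API version, and deployment name are illustrative placeholders, not values taken from this PR:

```python
import os

from openai import AzureOpenAI  # requires openai>=1.0

# Hypothetical environment variable names and API version -- substitute your own.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

# With Azure OpenAI, "model" takes your deployment name, not the base model name.
response = client.chat.completions.create(
    model="my-gpt-35-turbo-deployment",  # hypothetical deployment name
    messages=[{"role": "user", "content": "Say hello from Azure OpenAI."}],
)
print(response.choices[0].message.content)
```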
@@ -90,6 +90,20 @@ sections:
       If you wanted to help a GPT based model to accurately respond to the question "what model are you running?", you would need to provide that information to the model through techniques like [prompt engineering of the model's system message](/azure/ai-services/openai/concepts/advanced-prompt-engineering?pivots=programming-language-chat-completions), [Retrieval Augmented Generation (RAG)](/azure/machine-learning/concept-retrieval-augmented-generation?view=azureml-api-2) which is the technique used by [Azure OpenAI on your data](/azure/ai-services/openai/concepts/use-your-data) where up-to-date information is injected to the system message at query time, or via [fine-tuning](/azure/ai-services/openai/how-to/fine-tuning?pivots=programming-language-studio) where you could fine-tune specific versions of the model to answer that question in a certain way based on model version.
 
       To learn more about how GPT models are trained and work we recommend watching [Andrej Karpathy's talk from Build 2023 on the state of GPT](https://www.youtube.com/watch?v=bZQun8Y4L2A).
+  - question: |
+      I asked the model when its knowledge cutoff is and it gave me a different answer than what is on the Azure OpenAI models page. Why does this happen?
+    answer:
+      This is expected behavior. The models aren't able to answer questions about themselves. If you want to know when the knowledge cutoff for the model's training data is, consult the [models page](./concepts/models.md).
+  - question: |
+      I asked the model a question about something that happened recently, before the knowledge cutoff, and it got the answer wrong. Why does this happen?
+    answer: |
+      This is expected behavior. First, there's no guarantee that every recent event was part of the model's training data. And even when information was part of the training data, without additional techniques like Retrieval Augmented Generation (RAG) to help ground the model's responses, there's always a chance of ungrounded responses occurring. Both Azure OpenAI's [use your data feature](./concepts/use-your-data.md) and [Bing Chat](https://www.microsoft.com/edge/features/bing-chat?form=MT00D8) use Azure OpenAI models combined with Retrieval Augmented Generation to help further ground model responses.
+
+      The frequency that a given piece of information appeared in the training data can also impact the likelihood that the model will respond in a certain way.
+
+      Asking the latest GPT-4 Turbo Preview model about something that changed more recently, like "Who is the prime minister of New Zealand?", is likely to result in the fabricated response `Jacinda Ardern`. However, asking the model "When did `Jacinda Ardern` step down as prime minister?" tends to yield an accurate response, which demonstrates training data knowledge going to at least January of 2023.
+
+      So while it is possible to probe the model with questions to guess its training data knowledge cutoff, the [models page](./concepts/models.md) is the best place to check a model's knowledge cutoff.
   - question: |
      Where do I access pricing information for legacy models which are no longer available for new deployments?
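
As a companion to the prompt-engineering point in the hunk above, here is a hedged sketch of injecting a deployment's model information into the system message so the model can answer "what model are you running?". The deployment name, model version string, and environment variables are assumptions for illustration, not part of this change:

```python
import os

from openai import AzureOpenAI  # requires openai>=1.0

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

# The model can't reliably introspect its own version or knowledge cutoff,
# so supply those facts yourself through the system message.
system_message = (
    "You are an assistant running on the gpt-4 (1106-preview) model deployment "  # hypothetical version
    "with a knowledge cutoff of April 2023. If asked what model you are running "
    "or when your knowledge cutoff is, answer with exactly this information."
)

response = client.chat.completions.create(
    model="my-gpt-4-turbo-deployment",  # hypothetical deployment name
    messages=[
        {"role": "system", "content": system_message},
        {"role": "user", "content": "What model are you running?"},
    ],
)
print(response.choices[0].message.content)
```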
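Under the same assumptions, a small sketch of probing a deployment with the two questions from the new answer, to compare an ungrounded response against one that reveals training-data knowledge; without RAG-style grounding, the first answer may well be outdated:

```python
import os

from openai import AzureOpenAI  # requires openai>=1.0

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

# The two probes quoted in the answer above.
for question in [
    "Who is the prime minister of New Zealand?",
    "When did Jacinda Ardern step down as prime minister?",
]:
    response = client.chat.completions.create(
        model="my-gpt-4-turbo-deployment",  # hypothetical deployment name
        messages=[{"role": "user", "content": question}],
    )
    print(f"{question} -> {response.choices[0].message.content}")
```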