
Commit 25cf1a5

Merge pull request #1969 from MicrosoftDocs/main
12/12/2024 AM Publish
2 parents: 5ded047 + a4639de


2 files changed: 12 additions & 7 deletions


articles/ai-services/openai/concepts/fine-tuning-considerations.md

Lines changed: 6 additions & 6 deletions
@@ -66,9 +66,9 @@ Here’s an example: A customer wanted to use GPT-3.5-Turbo to turn natural lang
 **If you are ready for fine-tuning you:**

 - Have clear examples on how you have approached the challenges in alternate approaches and what’s been tested as possible resolutions to improve performance.
-- You've identified shortcomings using a base model, such as inconsistent performance on edge cases, inability to fit enough few shot prompts in the context window to steer the model, high latency, etc.
+- Identified shortcomings using a base model, such as inconsistent performance on edge cases, inability to fit enough few shot prompts in the context window to steer the model, high latency, etc.

-**Common signs you might not be ready for fine-tuning yet:**
+**Common signs you might not be ready for fine-tuning include:**

 - Insufficient knowledge from the model or data source.
 - Inability to find the right data to serve the model.
@@ -86,9 +86,9 @@ Another important point is even with high quality data if your data isn't in the

 **If you are ready for fine-tuning you:**

-- Have identified a dataset for fine-tuning.
-- The dataset is in the appropriate format for training.
-- Some level of curation has been employed to ensure dataset quality.
+- Identified a dataset for fine-tuning.
+- Formatted the dataset appropriately for training.
+- Curated the dataset to ensure quality.

 **Common signs you might not be ready for fine-tuning yet:**

@@ -103,4 +103,4 @@ There isn’t a single right answer to this question, but you should have clearl

 - Watch the [Azure AI Show episode: "To fine-tune or not to fine-tune, that is the question"](https://www.youtube.com/watch?v=0Jo-z-MFxJs)
 - Learn more about [Azure OpenAI fine-tuning](../how-to/fine-tuning.md)
-- Explore our [fine-tuning tutorial](../tutorials/fine-tune.md)
+- Explore our [fine-tuning tutorial](../tutorials/fine-tune.md)
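
The dataset bullets in the change above describe preparing data in the appropriate format for Azure OpenAI fine-tuning. As a supplement, here is a minimal Python sketch of writing a couple of training examples in the JSONL chat format that the fine-tuning service expects; the message contents and file name are illustrative assumptions, not values taken from this change.

```python
import json

# Illustrative training examples in chat format; real data would come from
# your own curated dataset (the messages below are made-up placeholders).
examples = [
    {
        "messages": [
            {"role": "system", "content": "You turn natural language into SQL."},
            {"role": "user", "content": "Show all orders placed in 2024."},
            {"role": "assistant", "content": "SELECT * FROM orders WHERE YEAR(order_date) = 2024;"},
        ]
    },
    {
        "messages": [
            {"role": "system", "content": "You turn natural language into SQL."},
            {"role": "user", "content": "Count customers by country."},
            {"role": "assistant", "content": "SELECT country, COUNT(*) FROM customers GROUP BY country;"},
        ]
    },
]

# Fine-tuning training files are JSON Lines: one JSON object per line.
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```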

articles/ai-services/openai/includes/fine-tuning-unified.md

Lines changed: 6 additions & 1 deletion
@@ -11,7 +11,12 @@ author: mrbullwinkle
 ms.author: mbullwin
 ---

-There are two unique fine-tuning experiences in Azure AI Foundry portal. Both allow you to fine-tune Azure OpenAI models, but only the Hub/Project view supports fine-tuning non Azure OpenAI models. If you are only using the Azure OpenAI fine-tuning experience which is available anytime you select a resource in a region where fine-tuning is supported.
+There are two unique fine-tuning experiences in the Azure AI Foundry portal:
+
+* [Hub/Project view](https://ai.azure.com) - supports fine-tuning models from multiple providers including Azure OpenAI, Meta Llama, Microsoft Phi, etc.
+* [Azure OpenAI centric view](https://oai.azure.com) - only supports fine-tuning Azure OpenAI models, but has support for additional features like the [Weights & Biases (W&B) preview integration](../how-to/weights-and-biases-integration.md).
+
+If you are only fine-tuning Azure OpenAI models, we recommend the Azure OpenAI centric fine-tuning experience which is available by navigating to [https://oai.azure.com](https://oai.azure.com).

 # [Azure OpenAI](#tab/azure-openai)

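The change above recommends a portal experience for fine-tuning Azure OpenAI models. For readers who prefer to start a job programmatically instead of through either portal view, a minimal sketch using the `openai` Python SDK against an Azure OpenAI resource follows; the endpoint, API version, file name, and base model name are illustrative assumptions, not values from this change.

```python
import os

from openai import AzureOpenAI

# Client for an Azure OpenAI resource in a region where fine-tuning is supported.
# The endpoint and API version below are placeholders; substitute your own.
client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-05-01-preview",
)

# Upload the JSONL training file prepared earlier.
training_file = client.files.create(
    file=open("training_data.jsonl", "rb"),
    purpose="fine-tune",
)

# Start the fine-tuning job against a fine-tunable base model (name is illustrative).
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-35-turbo-0613",
)

print(job.id, job.status)
```

Job progress can then be checked with `client.fine_tuning.jobs.retrieve(job.id)` or in either portal view.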