articles/machine-learning/how-to-use-foundation-models.md
You can filter the list of models in the model catalog by Task or by license.

> [!NOTE]
> Models from Hugging Face are subject to third-party license terms available on the Hugging Face model details page. It is your responsibility to comply with the model's license terms.
You can quickly test out any pre-trained model using the Sample Inference widget on the model card, providing your own sample input to test the result. Additionally, the model card for each model includes a brief description of the model and links to samples for code-based inferencing, fine-tuning, and evaluation of the model.

## How to evaluate foundation models using your own test data
Each model can be evaluated for the specific inference task that the model can be used for.

**Compute:**
1. Provide the Azure Machine Learning Compute cluster you would like to use for evaluating the model. Evaluation needs to run on GPU compute. Ensure that you have sufficient compute quota for the compute SKUs you wish to use.
1. Select **Finish** in the Evaluate wizard to submit your evaluation job. Once the job completes, you can view evaluation metrics for the model. Based on the evaluation metrics, you can decide whether you would like to fine-tune the model using your own training data. Additionally, you can decide whether you would like to register the model and deploy it to an endpoint.
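
If you prefer to inspect those metrics programmatically rather than in the studio UI, one possible approach is to read them from the job's MLflow run. This is a minimal sketch, not part of the wizard flow described above; it assumes the evaluation job logs its metrics to MLflow, and the workspace identifiers and job name are placeholders.

```python
# Hypothetical sketch: pull metrics for a completed evaluation job via MLflow.
# Subscription, resource group, workspace, and job name are placeholders.
import mlflow
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Point MLflow at the workspace tracking server, then look up the job by name.
mlflow.set_tracking_uri(ml_client.workspaces.get("<WORKSPACE_NAME>").mlflow_tracking_uri)
run = mlflow.get_run("<EVALUATION_JOB_NAME>")
print(run.data.metrics)
```
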

You can invoke the finetune settings form by selecting the **Finetune** button.

:::image type="content" source="./media/how-to-use-foundation-models/finetune-quick-wizard.png" alt-text="Screenshot showing the finetune settings options in the foundation models finetune settings form.":::
**Fine-tuning task type**
* Every pre-trained model from the model catalog can be fine-tuned for a specific set of tasks (for example: text classification, token classification, question answering). Select the task you would like to use from the drop-down.
* Validation data: Pass in the data you would like to use to validate your model. Selecting **Automatic split** reserves an automatic split of training data for validation. Alternatively, you can provide a different validation dataset.
* Test data: Pass in the test data you would like to use to evaluate your fine-tuned model. Selecting **Automatic split** reserves an automatic split of training data for testing.
* Compute: Provide the Azure Machine Learning Compute cluster you would like to use for fine-tuning the model. Fine-tuning needs to run on GPU compute. We recommend using compute SKUs with A100 / V100 GPUs when fine-tuning. Ensure that you have sufficient compute quota for the compute SKUs you wish to use.
3. Select **Finish** in the finetune settings form to submit your fine-tuning job. Once the job completes, you can view evaluation metrics for the fine-tuned model. You can then register the fine-tuned model produced by the fine-tuning job and deploy it to an endpoint for inferencing.

### Fine-tuning using code-based samples

Currently, Azure Machine Learning supports fine-tuning models for the following language tasks:
* Text classification
* Token classification
* Question answering
* Summarization
* Translation
To enable users to quickly get started with fine-tuning, we have published samples (both Python notebooks and CLI examples) for each task in the [azureml-examples git repo Finetune samples](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/system/finetune). Each model card also links to fine-tuning samples for supported fine-tuning tasks.
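
As a hedged illustration of what the linked Python samples boil down to, the sketch below loads a fine-tuning pipeline definition (for example, a YAML exported from one of the sample notebooks; the file name here is a placeholder) and submits it to a workspace with the Azure Machine Learning Python SDK v2. Refer to the samples themselves for the authoritative pipeline definitions.

```python
# Minimal sketch, not the official sample: submit a fine-tuning pipeline job.
# The pipeline YAML file name and the workspace identifiers are placeholders.
from azure.ai.ml import MLClient, load_job
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Load a pipeline job definition based on one of the published fine-tuning
# samples, then submit it and print the link to monitor it in the studio.
pipeline_job = load_job(source="text-classification-finetune-pipeline.yml")
submitted_job = ml_client.jobs.create_or_update(pipeline_job)
print(submitted_job.studio_url)
```
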
## Deploying foundation models to endpoints for inferencing
:::image type="content" source="./media/how-to-use-foundation-models/deploy-options.png" alt-text="Screenshot showing the deploy options on the foundation model card after user selects the deploy button.":::
Curated models from Azure Machine Learning are in MLflow format. If you plan to deploy these models to an online endpoint without network connectivity, you need to package the model first. Packaging is not required for models in the Hugging Face format.

:::image type="content" source="./media/how-to-use-foundation-models/studio-deploy-package.png" alt-text="Screenshot showing the package option for model deployment.":::
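
For readers who prefer code over the studio deploy wizard, the following is a minimal sketch (not the prescribed procedure above) of deploying a curated catalog model to a managed online endpoint with the Python SDK v2. The model asset ID, endpoint name, and VM SKU are placeholders; copy the real asset ID from the model card and pick a SKU with enough GPU memory for the model.

```python
# Hypothetical sketch: deploy a catalog model to a managed online endpoint.
# All names, IDs, and the instance type below are illustrative placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineDeployment, ManagedOnlineEndpoint
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

# Asset ID of a curated model from the azureml registry (copy it from the model card).
model_id = "azureml://registries/azureml/models/<MODEL_NAME>/versions/<VERSION>"

# Create the endpoint, then a deployment that serves the catalog model on GPU compute.
endpoint = ManagedOnlineEndpoint(name="<ENDPOINT_NAME>")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="default",
    endpoint_name="<ENDPOINT_NAME>",
    model=model_id,
    instance_type="Standard_NC6s_v3",  # example GPU SKU; size it to the model
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```
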