Commit 290e515

Update articles/ai-studio/concepts/concept-model-distillation.md
Co-authored-by: Samantha Salgado <[email protected]>
Parent: c84e052

File tree: 1 file changed (+2 −1 lines)


articles/ai-studio/concepts/concept-model-distillation.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -31,7 +31,8 @@ The main steps in knowledge distillation are:
 
 ## Sample notebook
 
-You can use the [sample notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/system/distillation/README.md) to see how to perform distillation. In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model.
+You can use the [sample notebook](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/system/distillation) to see how to perform distillation. In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model.
+
 
 We used an advanced prompt during synthetic data generation. The advanced prompt incorporates chain-of-thought (CoT) reasoning, which results in higher-accuracy data labels in the synthetic data. This labeling further improves the accuracy of the distilled model.
 
```
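The diff context mentions that a chain-of-thought prompt is used when generating synthetic labels, and that only the resulting labels feed the distilled model. A minimal, hypothetical sketch of that idea follows: the `fake_teacher` function, the `COT_PROMPT` wording, and the `Answer:` convention are all illustrative stand-ins, not the notebook's actual API (in the real sample, a deployed Meta Llama 3.1 405B Instruct endpoint would produce the completion).

```python
# Hypothetical sketch of CoT-based synthetic labeling for distillation.
# The teacher call is stubbed out with a canned completion.

COT_PROMPT = (
    "Question: {question}\n"
    "Think step by step, then give the final answer on a line "
    "starting with 'Answer:'."
)

def fake_teacher(prompt: str) -> str:
    # Stand-in for the teacher model's completion.
    return "Step 1: inspect the text.\nStep 2: weigh the cues.\nAnswer: positive"

def extract_label(completion: str) -> str:
    """Keep only the final answer. The step-by-step reasoning is what
    improves label accuracy, but it is discarded from the student's
    fine-tuning data; only the label is kept."""
    for line in completion.splitlines():
        if line.startswith("Answer:"):
            return line.removeprefix("Answer:").strip()
    raise ValueError("no final answer found in completion")

def make_training_example(question: str) -> dict:
    completion = fake_teacher(COT_PROMPT.format(question=question))
    return {"prompt": question, "completion": extract_label(completion)}

print(make_training_example("Is this review positive or negative?"))
```

The resulting `{"prompt": ..., "completion": ...}` pairs are the shape of data a smaller student model (such as Meta Llama 3.1 8B Instruct in the sample) could then be fine-tuned on.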
0 commit comments

Comments
 (0)