Commit 6fde963

Merge pull request #884 from PhilKang0704/broken-link-fix-ssalgadodev
Broken links fix - ssalgadodev
2 parents: 10b3913 + 290e515

1 file changed (+2, -1 lines)

articles/ai-studio/concepts/concept-model-distillation.md

Lines changed: 2 additions & 1 deletion
@@ -31,7 +31,8 @@ The main steps in knowledge distillation are:
 
 ## Sample notebook
 
-You can use the [sample notebook](https://aka.ms/meta-llama-3.1-distillation) to see how to perform distillation. In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model.
+You can use the [sample notebook](https://github.com/Azure/azureml-examples/tree/main/sdk/python/foundation-models/system/distillation) to see how to perform distillation. In this sample notebook, the teacher model uses the Meta Llama 3.1 405B instruction model, and the student model uses the Meta Llama 3.1 8B instruction model.
+
 
 We used an advanced prompt during synthetic data generation. The advanced prompt incorporates chain-of-thought (CoT) reasoning, which results in higher-accuracy data labels in the synthetic data. This labeling further improves the accuracy of the distilled model.
 
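As a rough sketch of the CoT labeling idea described in the changed section: the teacher model is prompted to reason step by step, and only its final answer is kept as the synthetic training label for the student. The `query_teacher` helper, the prompt wording, and the `ANSWER:` convention below are illustrative assumptions, not code from the linked notebook.

```python
# Minimal sketch of CoT-prompted synthetic data labeling for distillation.
# `query_teacher` is a hypothetical stand-in for a call to the teacher model
# (e.g., a Meta Llama 3.1 405B Instruct endpoint); the real implementation
# lives in the sample notebook linked in the diff above.

COT_LABELING_PROMPT = """\
You are labeling data for training a smaller model.

Question: {question}

Think step by step, then give your final answer on the last line
in the form: ANSWER: <label>
"""

def query_teacher(prompt: str) -> str:
    """Hypothetical call to the teacher model's completion endpoint."""
    raise NotImplementedError("Replace with your model client of choice.")

def generate_synthetic_label(question: str) -> dict:
    """Ask the teacher to reason step by step; keep only the final label."""
    completion = query_teacher(COT_LABELING_PROMPT.format(question=question))
    # The intermediate chain-of-thought improves label accuracy, but only
    # the final answer is kept as the training label.
    label = completion.rsplit("ANSWER:", 1)[-1].strip()
    return {"prompt": question, "completion": label}

# Each (prompt, completion) pair becomes one training example for
# fine-tuning the student model (e.g., Meta Llama 3.1 8B Instruct).
```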
