Commit 361e0d3

Merge pull request #281581 from ssalgadodev/patch-131
Update concept-model-distillation.md

Parents: 739cce3 + c862f02

File tree

1 file changed: +1 addition, -1 deletion


articles/ai-studio/concepts/concept-model-distillation.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ In Azure AI Studio, you can leverage Distillation to efficiently train the stude
 
 ## Distillation
 
-In machine learning, distillation is a technique used to transfer knowledge from a large, complex model (often called the “teacher model”) to a smaller, simpler model (the “student model”). This process helps the smaller model achieve similar performance to the larger one while being more efficient in terms of computation and memory usage12.
+In machine learning, distillation is a technique used to transfer knowledge from a large, complex model (often called the “teacher model”) to a smaller, simpler model (the “student model”). This process helps the smaller model achieve similar performance to the larger one while being more efficient in terms of computation and memory usage.
 
 
 The main steps in knowledge distillation involve:
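The teacher/student transfer described in the changed paragraph is commonly implemented by training the student to match the teacher's temperature-softened output distribution. A minimal sketch of that idea in plain Python follows; all names here are hypothetical illustrations, not the Azure AI Studio distillation API:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature gives softer targets."""
    scaled = [z / temperature for z in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - peak) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    Minimizing this loss pushes the student's predictions toward the
    teacher's soft targets, which is the core of knowledge distillation.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return sum(pt * (math.log(pt) - math.log(ps))
               for pt, ps in zip(p_teacher, p_student))

# A student that matches the teacher exactly incurs zero loss;
# a mismatched student incurs a positive loss.
matched = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
mismatched = distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1])
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on ground-truth labels, but the sketch above captures the transfer step itself.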

0 commit comments