
Commit c862f02

Update concept-model-distillation.md
1 parent 867a8ec commit c862f02

File tree: 1 file changed (+1, −1)


articles/ai-studio/concepts/concept-model-distillation.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ In Azure AI Studio, you can leverage Distillation to efficiently train the stude
 
 ## Distillation
 
-In machine learning, distillation is a technique used to transfer knowledge from a large, complex model (often called the “teacher model”) to a smaller, simpler model (the “student model”). This process helps the smaller model achieve similar performance to the larger one while being more efficient in terms of computation and memory usage12.
+In machine learning, distillation is a technique used to transfer knowledge from a large, complex model (often called the “teacher model”) to a smaller, simpler model (the “student model”). This process helps the smaller model achieve similar performance to the larger one while being more efficient in terms of computation and memory usage.
 
 
 The main steps in knowledge distillation involve:
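The changed paragraph describes teacher-to-student knowledge transfer. As an illustrative sketch only (not part of the article or this commit, and using hypothetical logits), the core idea of training the student to match the teacher's softened output distribution can be written in plain Python:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's softened outputs (p) and the
    # student's (q): the signal the student minimizes during distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical logits for one example: identical outputs give zero loss;
# divergent outputs give a positive loss the student would reduce.
teacher = [4.0, 1.0, 0.5]
student = [3.5, 1.2, 0.4]
loss = distillation_loss(teacher, student)
```

In practice this loss is usually combined with a standard cross-entropy term on ground-truth labels; the sketch above shows only the soft-target component.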
