Logit Standardization in Knowledge Distillation
First Author
Institution1
Institution1 address
firstauthor@i1.org
Second Author
Institution2
First line of institution2 address
secondauthor@i2.org
Algorithm 1: Z-score logit standardization pre-process in knowledge distillation.

Input: Transfer set D with samples of image-label pairs (x, y), number of classes K, base temperature τ, teacher f_T, student f_S, loss L (e.g., KL divergence L_KL)
Output: Trained student model f_S

1   foreach (x, y) in D do
2       z_T ← f_T(x), z_S ← f_S(x)                          // teacher and student logits
3       mean(z_T) ← (1/K) Σ_k z_T[k], likewise mean(z_S)    // per-sample logit means
4       σ(z_T) ← sqrt((1/K) Σ_k (z_T[k] − mean(z_T))²), likewise σ(z_S)   // logit standard deviations
5       Z(z_T; τ) ← (z_T − mean(z_T)) / (τ σ(z_T))          // standardized teacher logits
6       Z(z_S; τ) ← (z_S − mean(z_S)) / (τ σ(z_S))          // standardized student logits
7       q(z_T) ← softmax(Z(z_T; τ))                         // teacher soft targets
8       q(z_S) ← softmax(Z(z_S; τ))                         // student predictions
9       Update f_S towards minimizing L(q(z_T), q(z_S))
10  end foreach
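To make the pre-process concrete, the following is a minimal PyTorch sketch of the inner loop of Algorithm 1. This is a reader's illustration under stated assumptions, not the authors' released implementation: the function names zscore and zscore_kd_loss, the epsilon guard on the standard deviation, and the default temperature value are all introduced here for exposition.

import torch
import torch.nn.functional as F

def zscore(logits: torch.Tensor, temperature: float) -> torch.Tensor:
    # Standardize each sample's logits over the class dimension:
    # subtract the mean and divide by (temperature * standard deviation),
    # as in lines 3-6 of Algorithm 1.
    mean = logits.mean(dim=-1, keepdim=True)
    # unbiased=False gives the population standard deviation (1/K),
    # matching the definition in the algorithm.
    std = logits.std(dim=-1, unbiased=False, keepdim=True)
    # The small epsilon is an assumption added here for numerical stability.
    return (logits - mean) / (temperature * std + 1e-7)

def zscore_kd_loss(student_logits: torch.Tensor,
                   teacher_logits: torch.Tensor,
                   temperature: float = 2.0) -> torch.Tensor:
    # KL-divergence distillation loss on Z-score standardized logits
    # (lines 5-9 of Algorithm 1).
    z_s = zscore(student_logits, temperature)
    z_t = zscore(teacher_logits, temperature)
    # F.kl_div expects log-probabilities as input and probabilities as target;
    # "batchmean" averages over the batch, the usual KD convention.
    return F.kl_div(F.log_softmax(z_s, dim=-1),
                    F.softmax(z_t, dim=-1),
                    reduction="batchmean")

# Usage inside a training loop (teacher frozen, student trainable):
# for x, y in loader:
#     with torch.no_grad():
#         z_t = teacher(x)
#     z_s = student(x)
#     loss = zscore_kd_loss(z_s, z_t, temperature=tau)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()

Note that in this sketch the base temperature enters only through the standardization itself, so no separate τ² scaling of the loss (as in vanilla knowledge distillation) is applied.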