Commit 314f253

Merge branch 'main' into patch-1
2 parents 0ebd0b4 + 18c32d1

2 files changed: +2 −2 lines changed

advanced_source/usb_semisup_learn.py

Lines changed: 1 addition & 1 deletion

@@ -81,7 +81,7 @@
 # algorithm on dataset
 #
 # Note that a CUDA-enabled backend is required for training with the ``semilearn`` package.
-# See `Enabling CUDA in Google Colab <https://pytorch.org/tutorials/beginner/colab#using-cuda>`__ for instructions
+# See `Enabling CUDA in Google Colab <https://pytorch.org/tutorials/beginner/colab#enabling-cuda>`__ for instructions
 # on enabling CUDA in Google Colab.
 #
 import semilearn

beginner_source/knowledge_distillation_tutorial.py

Lines changed: 1 addition & 1 deletion

@@ -324,7 +324,7 @@ def train_knowledge_distillation(teacher, student, train_loader, epochs, learnin
         soft_prob = nn.functional.log_softmax(student_logits / T, dim=-1)

         # Calculate the soft targets loss. Scaled by T**2 as suggested by the authors of the paper "Distilling the knowledge in a neural network"
-        soft_targets_loss = -torch.sum(soft_targets * soft_prob) / soft_prob.size()[0] * (T**2)
+        soft_targets_loss = torch.sum(soft_targets * (soft_targets.log() - soft_prob)) / soft_prob.size()[0] * (T**2)

         # Calculate the true label loss
         label_loss = ce_loss(student_logits, labels)
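The patched line replaces a soft cross-entropy with an explicit KL divergence between the teacher's and student's temperature-softened distributions. The following is a minimal sketch of the two forms, using made-up random logits and an assumed temperature `T = 2.0` (neither comes from the tutorial). It illustrates that the two losses differ only by the teacher-entropy term, which is constant with respect to the student, so the gradients the student sees are the same:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
T = 2.0  # assumed distillation temperature, for illustration only
teacher_logits = torch.randn(4, 10)  # stand-in teacher outputs
student_logits = torch.randn(4, 10)  # stand-in student outputs

# Temperature-softened teacher probabilities and student log-probabilities
soft_targets = nn.functional.softmax(teacher_logits / T, dim=-1)
soft_prob = nn.functional.log_softmax(student_logits / T, dim=-1)

# Old form (pre-patch): soft cross-entropy, scaled by T**2
ce_form = -torch.sum(soft_targets * soft_prob) / soft_prob.size()[0] * (T**2)

# New form (post-patch): KL(teacher || student), scaled by T**2
kl_form = torch.sum(soft_targets * (soft_targets.log() - soft_prob)) / soft_prob.size()[0] * (T**2)

# The difference is exactly the (scaled) teacher entropy, which does not
# depend on the student's parameters and thus contributes no gradient.
teacher_entropy = -torch.sum(soft_targets * soft_targets.log()) / soft_prob.size()[0] * (T**2)
print(torch.allclose(kl_form, ce_form - teacher_entropy, atol=1e-5))  # True
```

The KL form is preferable for reporting because it is non-negative and reaches zero exactly when the student matches the teacher's softened distribution, whereas the cross-entropy form is offset by the teacher's entropy.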
