Abstract: Knowledge distillation (KD) is a model compression technique that transfers knowledge from a complex and well-trained teacher model to a compact student model, thereby enabling the student ...
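To make the idea concrete, here is a minimal sketch of the classic soft-label distillation loss (the Hinton-style formulation, not necessarily the method of the paper above): the student is trained on a weighted sum of a KL-divergence term against the teacher's temperature-softened predictions and the usual cross-entropy on ground-truth labels. The temperature `T` and weight `alpha` are illustrative hyperparameters, not values taken from the abstract.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-label KD loss: KL divergence on temperature-softened logits
    plus cross-entropy on the ground-truth labels."""
    # Soften both distributions with temperature T so the teacher's
    # "dark knowledge" (relative probabilities of wrong classes) survives.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_probs = F.log_softmax(student_logits / T, dim=-1)
    # Scale the KL term by T^2 to keep its gradient magnitude comparable
    # to the cross-entropy term.
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T ** 2)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```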
Abstract: Black-box unsupervised domain adaptation (BBUDA) is a challenging setting in which knowledge is transferred from the source domain to the target domain without access to either the source data or the source model ...
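Since the full method is not visible here, the sketch below shows only the generic self-training recipe common in BBUDA work, under the assumption that the source model is reachable solely as a prediction API: query it for target-domain predictions, keep the confident ones as pseudo-labels, and train the student on those. The names `teacher_api` and `conf_thresh` are hypothetical.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def query_black_box(teacher_api, target_batch):
    # The source model is only a service: we receive class probabilities,
    # never its weights, gradients, or intermediate features (assumed API).
    return teacher_api(target_batch)

def self_training_step(student, teacher_api, target_batch, optimizer,
                       conf_thresh=0.9):
    """One BBUDA-style self-training step: distill the black-box teacher's
    target-domain predictions into the student, filtering noisy labels."""
    probs = query_black_box(teacher_api, target_batch)
    conf, pseudo = probs.max(dim=-1)
    mask = conf >= conf_thresh  # keep only confident pseudo-labels
    if mask.any():
        logits = student(target_batch[mask])
        loss = F.cross_entropy(logits, pseudo[mask])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```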