Abstract: Knowledge distillation (KD) is a model compression technique that transfers knowledge from a complex and well-trained teacher model to a compact student model, thereby enabling the student ...
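The abstract does not spell out a particular distillation objective, so the following is only a minimal sketch of the standard soft-target formulation (Hinton-style KD) it alludes to, written with PyTorch; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values taken from the paper.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft-target distillation term with ordinary supervised loss."""
    # Soft-target term: KL divergence between temperature-softened
    # teacher and student distributions, rescaled by T^2 as is conventional.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this sketch the teacher's softened outputs act as the "knowledge" being transferred: the compact student is pushed to match the teacher's full output distribution rather than only the one-hot labels.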
Abstract: Black-box unsupervised domain adaptation (BBUDA) is a challenging task in which knowledge is transferred from the source domain to the target domain without access to either the source data or the source model ...
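Since the abstract is truncated before describing the method, the sketch below only illustrates the common BBUDA baseline it presupposes: the source model is reachable solely through its predictions, which serve as soft pseudo-labels for training a target model. The interface `query_source_api` is a hypothetical placeholder for whatever black-box prediction service is available; the single KL-distillation objective is an assumption, not the paper's actual procedure.

```python
import torch
import torch.nn.functional as F

def adapt_to_target(target_model, target_loader, query_source_api, optimizer):
    """Adapt a target model using only black-box source predictions.

    `query_source_api(x)` (hypothetical) returns the source model's output
    probabilities for a batch of target inputs; no source data or source
    parameters are ever accessed, matching the BBUDA setting.
    """
    target_model.train()
    for x, _ in target_loader:                 # target labels are never used
        with torch.no_grad():
            teacher_probs = query_source_api(x)   # soft pseudo-labels
        log_probs = F.log_softmax(target_model(x), dim=1)
        # Distill the black-box predictions into the target model.
        loss = F.kl_div(log_probs, teacher_probs, reduction="batchmean")
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

This framing makes the connection to the first abstract explicit: BBUDA can be read as knowledge distillation in which the teacher is an opaque, domain-mismatched predictor queried only at inference time.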