Topology Distillation for Recommender System

This repository provides the source code of "Topology Distillation for Recommender System", accepted to KDD 2021 as a research paper.

1. Overview

We develop a general topology distillation approach for recommender systems. Topology distillation guides the learning of the student model with the topological structure built upon the relational knowledge in the representation space of the teacher model.

Concretely, we propose two topology distillation methods:

  1. Full Topology Distillation (FTD). FTD transfers the full topology and is used when the student has enough capacity to learn all of the teacher's knowledge (see the first sketch after this list).
  2. Hierarchical Topology Distillation (HTD). HTD transfers the decomposed topology hierarchically and is adopted in the classical KD scenario where the student has very limited capacity compared to the teacher (see the second sketch below).
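To make the idea concrete, below is a minimal sketch of what an FTD-style objective can look like in PyTorch. This is an illustration only, not the code shipped in this repository: the helper names (`topology`, `ftd_loss`) and the choice of cosine similarity as the relational measure are assumptions, so please refer to the source files for the actual formulation.

```python
# Illustrative sketch of full topology distillation (FTD); not the
# repository's implementation. Names and details are assumptions.
import torch
import torch.nn.functional as F

def topology(emb):
    # Pairwise cosine-similarity matrix over a batch of embeddings:
    # one simple way to encode the relational "topology" of the space.
    emb = F.normalize(emb, dim=1)
    return emb @ emb.t()

def ftd_loss(teacher_emb, student_emb):
    # FTD idea: make the student's full topology match the teacher's.
    # The two models may use different embedding dimensions; both
    # similarity matrices are (batch x batch), so they are comparable.
    target = topology(teacher_emb).detach()  # teacher is fixed
    return F.mse_loss(topology(student_emb), target)

# Example: a 200-dim teacher and a 20-dim student (dimensions are
# arbitrary here) over the same batch of 256 entities.
teacher_emb = torch.randn(256, 200)
student_emb = torch.randn(256, 20, requires_grad=True)
loss = ftd_loss(teacher_emb, student_emb)
loss.backward()
```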

(Figure: overview of the two topology distillation methods, FTD and HTD)
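And a correspondingly rough sketch of the hierarchical decomposition behind HTD. This simplification uses fixed hard group labels (`group_ids`) purely for illustration, whereas the actual method decomposes the topology differently (e.g., with learned assignments); it reuses `topology` and the embeddings from the FTD sketch above.

```python
# Assumption-laden simplification of hierarchical topology distillation
# (HTD); not the authors' exact method. Assumes every group is non-empty.
def htd_loss(teacher_emb, student_emb, group_ids, num_groups):
    # Group-level topology: relations among group prototypes
    # (here, simple mean embeddings per group).
    t_protos = torch.stack([teacher_emb[group_ids == g].mean(0)
                            for g in range(num_groups)])
    s_protos = torch.stack([student_emb[group_ids == g].mean(0)
                            for g in range(num_groups)])
    loss = F.mse_loss(topology(s_protos), topology(t_protos).detach())

    # Entity-level topology: relations among entities inside each group,
    # so the student only has to preserve local structure in detail.
    for g in range(num_groups):
        members = group_ids == g
        if members.sum() > 1:
            loss = loss + F.mse_loss(
                topology(student_emb[members]),
                topology(teacher_emb[members]).detach(),
            )
    return loss

# Example with 4 illustrative groups over the same batch as above.
group_ids = torch.randint(0, 4, (256,))
loss = htd_loss(teacher_emb, student_emb, group_ids, num_groups=4)
```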

2. Main Results

  • When the capacity of the student model is highly limited, the student model learns best with HTD.

    (Figure: results with a highly limited student capacity)

  • As the capacity gap between the teacher model and the student model decreases, the student model benefits more from FTD.

    (Figure: results as the teacher-student capacity gap decreases)

3. Requirements

  • Python version: 3.6.10
  • PyTorch version: 1.5.0

4. How to Run

Please refer to the 'Guide to using topology distillation.ipynb' notebook.