Learning to Teach with Dynamic Loss Functions

This repo contains a simple demo for the NeurIPS 2018 paper: Learning to Teach with Dynamic Loss Functions.

@inproceedings{wu2018learning,
  title={Learning to teach with dynamic loss functions},
  author={Wu, Lijun and Tian, Fei and Xia, Yingce and Fan, Yang and Qin, Tao and Lai, Jian-Huang and Liu, Tie-Yan},
  booktitle={Advances in Neural Information Processing Systems},
  pages={6466--6477},
  year={2018}
}

Description

  • Please note this is only a simple demo of the MNIST experiments based on LeNet.
  • Please note the algorithm in the demo is slightly different from the one in the paper, but the main spirit is the same.
  • The code is based on the Theano framework, which is now quite old, so it may not run directly in modern environments.

Critical Code Details

Refer to loss_lenet_light_dynamic.py for the detailed demo code. The general comments are as follows:

The 'reverse model training' part defines the updates of the teacher model, and its last block implements the detailed reverse-model update chain.
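Since the Theano code can be hard to run today, below is a minimal NumPy sketch of the reverse update idea under a deliberately simplified setup: a linear-regression student and a teacher that re-weights per-example losses. It only illustrates the teacher/student update chain, not the repo's implementation, and every name in it (student_w, teacher_phi, etc.) is made up for this sketch.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: a training set for the student and a dev set that drives the teacher.
X_train, y_train = rng.normal(size=(32, 5)), rng.normal(size=32)
X_dev, y_dev = rng.normal(size=(16, 5)), rng.normal(size=16)

student_w = np.zeros(5)      # student parameters (linear model)
teacher_phi = np.zeros(32)   # teacher parameters: one logit per training example
lr_student, lr_teacher = 0.1, 0.05

for step in range(100):
    # Teacher defines the dynamic training loss via per-example weights sigma(phi).
    weights = 1.0 / (1.0 + np.exp(-teacher_phi))
    residual = X_train @ student_w - y_train
    grad_w = X_train.T @ (weights * residual) / len(y_train)

    # Student update under the teacher-shaped loss.
    new_w = student_w - lr_student * grad_w

    # Reverse update chain: backpropagate the dev loss through the student's
    # update step into the teacher parameters (d L_dev / d phi).
    dev_residual = X_dev @ new_w - y_dev
    d_dev_d_neww = X_dev.T @ dev_residual / len(y_dev)                               # (5,)
    d_neww_d_weights = -lr_student * (X_train * residual[:, None]).T / len(y_train)  # (5, 32)
    d_dev_d_weights = d_neww_d_weights.T @ d_dev_d_neww                              # (32,)
    d_weights_d_phi = weights * (1.0 - weights)                                      # sigmoid derivative
    grad_phi = d_dev_d_weights * d_weights_d_phi

    # Apply both updates.
    student_w = new_w
    teacher_phi -= lr_teacher * grad_phi

print("final dev loss:", 0.5 * np.mean((X_dev @ student_w - y_dev) ** 2))

The key step is backpropagating the dev loss through the student's one-step update into the teacher parameters; the actual demo carries out the analogous chain with Theano gradients over LeNet and a learned loss-function teacher.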

