Lifelong infinite mixture model

📋 This is the implementation of the Lifelong infinite mixture model

📋 Accepted by ICCV 2021

Title: Lifelong Infinite Mixture Model Based on Knowledge-Driven Dirichlet Process

Paper link

https://arxiv.org/abs/2108.12278

Abstract

Recent research efforts in lifelong learning propose to grow a mixture or use an ensemble structure to adapt the deep model to learning a growing number of tasks. The proposed methodology shows promising results in overcoming catastrophic forgetting. However, the theory behind these successful models is still not well understood. In this paper, we perform a theoretical analysis for lifelong learning models by deriving risk bounds based on the discrepancy distance between the probabilistic representation of data generated by the model and that corresponding to the target set. Inspired by the theoretical analysis, we introduce a new lifelong learning approach, namely the Lifelong Infinite Mixture model (LIMix), which can automatically expand its network architecture or choose an appropriate component to adapt its parameters for learning a new task, while preserving its previously learnt information. We propose to incorporate knowledge into the Dirichlet process by using a gating mechanism which computes the dependence between the knowledge learnt previously and stored in each component and a new set of data, benefiting the accuracy and efficiency of selection and expansion for LIMix. In addition, we train a compact Student model which can accumulate cross-domain representations over time and make quick inferences.
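📋 The selection-versus-expansion step described above can be pictured with a short sketch. This is a minimal illustration under assumed names (the `Component` class, `select_or_expand`, the diagonal-Gaussian knowledge summary, and the threshold are all hypothetical), not the TensorFlow implementation in this repository; it only shows the idea of comparing the knowledge stored in each component against a new task and expanding the mixture when no component matches well enough.

```python
import numpy as np

class Component:
    """A mixture component, summarised here by a diagonal Gaussian over features
    (a stand-in for the knowledge stored by a trained component)."""
    def __init__(self, mean, var):
        self.mean = np.asarray(mean, dtype=float)
        self.var = float(var)

    def log_likelihood(self, x):
        # Average diagonal-Gaussian log-likelihood of a batch x of shape (N, D).
        d = x.shape[1]
        quad = np.sum((x - self.mean) ** 2, axis=1) / self.var
        return np.mean(-0.5 * (quad + d * np.log(2 * np.pi * self.var)))

def select_or_expand(components, new_task_data, threshold=-50.0):
    """Gating step: pick the component whose stored knowledge best matches the
    new task, or add a new component when no existing one matches well enough."""
    scores = [c.log_likelihood(new_task_data) for c in components]
    best = int(np.argmax(scores)) if scores else -1
    if not scores or scores[best] < threshold:
        # No stored knowledge explains the new data well: expand the mixture.
        new_c = Component(new_task_data.mean(axis=0), new_task_data.var())
        components.append(new_c)
        return len(components) - 1, True
    # Otherwise reuse (and later adapt) the selected component.
    return best, False

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    comps = [Component(np.zeros(4), 1.0)]          # knowledge from a previous task
    task = rng.normal(loc=5.0, scale=1.0, size=(128, 4))  # a mismatched new task
    idx, expanded = select_or_expand(comps, task)
    print(f"selected component {idx}, expanded={expanded}")
```

📋 In the sketch, the mismatched task falls below the (assumed) threshold, so a new component is created rather than overwriting the old one; this is the mechanism that lets the mixture grow with the number of tasks while preserving previously learnt information.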

Environment

  1. TensorFlow 2.1
  2. Python 3.6

Training and evaluation

📋 We provide an easy way to train the model and evaluate its performance.

📋 Run python xxx.py; the model will be trained automatically and the results will be reported after training.

📋 Different parameter settings of LIMix lead to different results; we also provide the settings used in our experiments.

BibTeX

📋 If you use our code, please cite our paper as:

@misc{ye2021lifelong,
  title={Lifelong Infinite Mixture Model Based on Knowledge-Driven Dirichlet Process},
  author={Fei Ye and Adrian G. Bors},
  year={2021},
  eprint={2108.12278},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
