PT4AL: Using Self-Supervised Pretext Tasks for Active Learning

Official PyTorch implementation of "PT4AL: Using Self-Supervised Pretext Tasks for Active Learning" (ECCV 2022).

Update Note

  • We have fixed all reported problems. The rotation prediction pretext task was supposed to run for only 15 epochs, but it was incorrectly set to 120 epochs. Sorry for the inconvenience. [2023.01.02]
  • Added cold start experiments.

[Solved problem]
We are redoing the CIFAR10 experiment.

The current reproduction achieves an accuracy of 91-93%.

We will re-tune the code for more stable performance in the near future.

The remaining experiments were confirmed to reproduce without problems.

Sorry for the inconvenience.

Experiment Setting:

  • CIFAR10 (downloaded and saved in ./DATA)
  • Rotation prediction as the pretext task

Prerequisites:

Python >= 3.7

CUDA = 11.0

PyTorch = 1.7.1

numpy >= 1.16.0

Running the Code

To generate the train and test datasets:

python make_data.py
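
What this step amounts to, as a minimal sketch (not the repository's exact make_data.py; the ./DATA/<split>/<label> layout and PNG naming are assumptions for illustration):

import os
from torchvision.datasets import CIFAR10

# Download CIFAR-10 and save each split as image files under ./DATA.
for split, is_train in (("train", True), ("test", False)):
    dataset = CIFAR10(root="./DATA", train=is_train, download=True)
    for i, (img, label) in enumerate(dataset):
        out_dir = os.path.join("./DATA", split, str(label))
        os.makedirs(out_dir, exist_ok=True)
        img.save(os.path.join(out_dir, f"{i}.png"))  # PIL image from torchvision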

To train the rotation prediction task on the unlabeled set:

python rotation.py
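
The pretext task rotates each image by 0, 90, 180, or 270 degrees and trains a 4-way classifier to predict the rotation; per the update note it runs for 15 epochs. A minimal sketch of the idea (the ResNet-18 backbone, optimizer settings, and data pipeline are illustrative assumptions, not the repository's exact code):

import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T
from torch.utils.data import DataLoader

device = "cuda" if torch.cuda.is_available() else "cpu"
train_set = torchvision.datasets.CIFAR10(root="./DATA", train=True,
                                         download=True, transform=T.ToTensor())
loader = DataLoader(train_set, batch_size=128, shuffle=True)

model = torchvision.models.resnet18(num_classes=4).to(device)  # 4 rotation classes
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
criterion = nn.CrossEntropyLoss()

for epoch in range(15):              # 15 epochs, per the update note
    for images, _ in loader:         # CIFAR10 labels are ignored (self-supervised)
        # Build all four rotations of the batch and their rotation labels 0..3.
        rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)])
        targets = torch.arange(4).repeat_interleave(images.size(0))
        rotated, targets = rotated.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = criterion(model(rotated), targets)
        loss.backward()
        optimizer.step()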

To extract pretext task losses and create batches:

python make_batches.py
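
Conceptually, this step evaluates the trained rotation network on every unlabeled sample, sorts the pool by the resulting pretext loss (hardest first, following the paper), and splits it into one batch per active learning cycle. A hedged sketch: `model` is the trained rotation network from the previous step, and `pool_loader`, the output file names, and the 10-way split are assumptions:

import numpy as np
import torch
import torch.nn.functional as F

model.eval()
losses, paths = [], []
with torch.no_grad():
    for image, path in pool_loader:  # assumed loader yielding (1,C,H,W) tensors and paths
        rotated = torch.cat([torch.rot90(image, k, dims=(2, 3)) for k in range(4)])
        targets = torch.arange(4, device=device)
        losses.append(F.cross_entropy(model(rotated.to(device)), targets).item())
        paths.append(path[0])

order = np.argsort(losses)[::-1]     # sort the pool, highest pretext loss first
batches = np.array_split(order, 10)  # one batch per AL cycle (assumed 10 cycles)
for b, idx in enumerate(batches):
    with open(f"batch_{b}.txt", "w") as f:
        f.writelines(paths[i] + "\n" for i in idx)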

To evaluate on the active learning task:

python main.py
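
In each active learning cycle, main.py draws its query budget from the next pretext-loss batch, adds the newly labeled samples to the training set, and retrains the classifier. A schematic sketch of that loop; train_and_evaluate and select_uncertain are hypothetical stand-ins for the repository's training and uncertainty-sampling code, and the 1,000-sample budget is an assumption:

labeled, model = [], None
for cycle in range(10):
    with open(f"batch_{cycle}.txt") as f:  # batches from make_batches.py
        candidates = [line.strip() for line in f]
    if cycle == 0:
        # No trained classifier yet: sample the first batch uniformly.
        step = max(1, len(candidates) // 1000)
        queried = candidates[::step][:1000]
    else:
        # Later cycles: query this batch's most uncertain samples.
        queried = select_uncertain(model, candidates, budget=1000)
    labeled += queried
    model, acc = train_and_evaluate(labeled)  # retrain on all labels so far
    print(f"cycle {cycle}: {len(labeled)} labels, test accuracy {acc:.2f}")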

To run the cold start experiments (random baseline):

python main_random.py
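
In the random baseline, the initial labeled set is drawn uniformly at random from the unlabeled pool. A minimal sketch (the pool listing file and 1,000-sample budget are assumptions):

import random

with open("unlabeled_pool.txt") as f:    # assumed listing of all pool images
    pool = [line.strip() for line in f]
random.seed(0)                           # reproducible draw
initial_set = random.sample(pool, 1000)  # assumed initial budget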

To run the cold start experiments (PT4AL):

python main_pt4al.py
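
PT4AL instead selects the initial labeled set from the pretext-loss ranking. One plausible reading, sampling uniformly across the hardest batch so the initial labels span its loss range (file name and budget are assumptions, not the repository's exact rule):

with open("batch_0.txt") as f:       # hardest pretext-loss batch from make_batches.py
    ranked = [line.strip() for line in f]
step = max(1, len(ranked) // 1000)   # assumed budget of 1,000 samples
initial_set = ranked[::step][:1000]  # spread picks across the loss ranking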

Citation

If you use our code in your research or find our work helpful, please consider citing us with the BibTeX entry below:

@inproceedings{yi2022using,
  title = {Using Self-Supervised Pretext Tasks for Active Learning},
  author = {Yi, John Seon Keun and Seo, Minseok and Park, Jongchan and Choi, Dong-Geol},
  booktitle = {Proc. ECCV},
  year = {2022},
}
