Pixel-Level Cycle Association

This is the PyTorch implementation of our NeurIPS 2020 oral paper Pixel-Level Cycle Association: A New Perspective for Domain Adaptive Semantic Segmentation.

Requirements

pip install -r ./requirements.txt

We tested our code on two NVIDIA Tesla V100 (32 GB) GPUs.

Dataset

See experiments/data/

Pre-trained Model

Following common practice, training starts from a ResNet-101 backbone pretrained on ImageNet. Please download the weight file and put it under the model directory.
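The snippet below is a minimal sketch of restoring such a checkpoint from the model directory, not the repository's actual loader; the file name used here is a hypothetical placeholder.

```python
# Minimal sketch: load an ImageNet-pretrained ResNet-101 checkpoint placed
# under ./model/. The file name below is a hypothetical placeholder, not the
# name used by this repository.
import torch
import torchvision

backbone = torchvision.models.resnet101()  # architecture only, no pretrained download
state_dict = torch.load("./model/resnet101-imagenet.pth", map_location="cpu")
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```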

Training

For GTAV to Cityscapes:

CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 --use_env ./tools/train.py --cfg ./experiment/config/g2c_train.yaml --exp_name g2c 

For SYNTHIA to Cityscapes:

CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 --use_env ./tools/train.py --cfg ./experiment/config/s2c_train.yaml --exp_name s2c 

You can also train the model with the shell script provided at experiment/scripts/train.sh.
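The commands above launch one process per GPU through torch.distributed.launch with --use_env, which exports the distributed context (LOCAL_RANK, RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT) as environment variables. The following is a minimal sketch of how a training entry point typically reads that context; it is an illustration, not the repository's train.py.

```python
# Minimal sketch of distributed initialization under --use_env (or torchrun):
# the launcher sets LOCAL_RANK/RANK/WORLD_SIZE in the environment, and
# init_process_group with init_method="env://" reads them.
import os
import torch
import torch.distributed as dist

def init_distributed() -> int:
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend="nccl", init_method="env://")
    return local_rank

if __name__ == "__main__":
    local_rank = init_distributed()
    print(f"rank {dist.get_rank()}/{dist.get_world_size()} using GPU {local_rank}")
    dist.destroy_process_group()
```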

Test

For GTAV to Cityscapes:

CUDA_VISIBLE_DEVICES=0,1 python ./tools/test.py --cfg ./experiment/config/g2c_test.yaml --weights ${PATH_TRAINED_WEIGHTS} --exp_name g2c_test

For SYNTHIA to Cityscapes:

CUDA_VISIBLE_DEVICES=0,1 python ./tools/test.py --cfg ./experiment/config/s2c_test.yaml --weights ${PATH_TRAINED_WEIGHTS} --exp_name s2c_test

You can also evaluate the model with the shell script provided at experiment/scripts/test_normal.sh.

Our trained models for both tasks can be downloaded from PLCA-trained-model, achieving test mIoU of 47.8% and 46.9% (16 classes), respectively.
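The mIoU figures above are the mean intersection-over-union over the evaluation classes. Below is a minimal sketch of computing mIoU from a pixel-level confusion matrix; it is illustrative only, not the repository's evaluation code.

```python
# Minimal sketch: mean IoU from a confusion matrix (not the repository's code).
import numpy as np

def mean_iou(conf: np.ndarray) -> float:
    # conf[i, j] counts pixels with ground-truth class i predicted as class j.
    tp = np.diag(conf).astype(np.float64)
    fp = conf.sum(axis=0) - tp
    fn = conf.sum(axis=1) - tp
    denom = tp + fp + fn
    iou = np.where(denom > 0, tp / np.maximum(denom, 1), np.nan)  # skip absent classes
    return float(np.nanmean(iou))

# Toy example with 3 classes.
cm = np.array([[50, 2, 3],
               [4, 40, 1],
               [2, 2, 30]])
print(f"mIoU: {mean_iou(cm):.3f}")
```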

Citing

Please cite our paper if you use our code in your research:

@inproceedings{kang2020pixel,
  title={Pixel-Level Cycle Association: A New Perspective for Domain Adaptive Semantic Segmentation},
  author={Kang, Guoliang and Wei, Yunchao and Yang, Yi and Zhuang, Yueting and Hauptmann, Alexander G},
  booktitle={NeurIPS},
  year={2020}
}

Contact

If you have any questions, please contact me at kgl.prml@gmail.com.

Thanks to third-party code

torchvision

LovaszSoftmax
