EMLight: Lighting Estimation via Spherical Distribution Approximation (AAAI 2021)

[Teaser figure]

Update

  • 12/2021: We release our Virtual Object Relighting (VOR) dataset for lighting estimation evaluation. Please refer to the Virtual Object Insertion & Rendering section.
  • 07/2021: We include a new Needlets basis for lighting representation, which represents illumination in both the spatial and frequency domains (a rough illustrative sketch of a needlet kernel follows this list). The implementation code is available in Needlets/ of this repository.
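
For intuition only, the sketch below evaluates an axially symmetric needlet-style kernel as a band-limited sum of Legendre polynomials. It is not the code in Needlets/: the window function here is a simplified smooth bump, and the constants B and lmax are illustrative choices.

# Illustrative needlet-style kernel: psi_j(theta) = sum_l b(l / B**j) * (2l+1)/(4*pi) * P_l(cos theta).
# The window b below is a simplified bump, not the repository's construction.
import numpy as np
from scipy.special import eval_legendre

def window_b(u, B=2.0):
    # Smooth bump supported on (1/B, B); zero outside the band.
    u = np.asarray(u, dtype=float)
    out = np.zeros_like(u)
    mask = (u > 1.0 / B) & (u < B)
    out[mask] = np.cos(0.5 * np.pi * np.log(u[mask]) / np.log(B)) ** 2
    return out

def needlet_kernel(theta, j, B=2.0, lmax=64):
    # Band-limited sum over spherical harmonic degrees l.
    ells = np.arange(lmax + 1)
    weights = window_b(ells / B ** j, B) * (2 * ells + 1) / (4 * np.pi)
    x = np.cos(theta)
    return sum(w * eval_legendre(l, x) for l, w in zip(ells, weights))

# Kernel response as a function of angular distance from a needlet center.
theta = np.linspace(0.0, np.pi, 181)
print(needlet_kernel(theta, j=3)[:5])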

Prerequisites

  • Linux or macOS
  • Python3, PyTorch
  • CPU or NVIDIA GPU + CUDA CuDNN

Dataset Preparation

Laval Indoor HDR Dataset
Due to the intellectual property terms of the Laval Indoor dataset, I cannot release the original dataset or the processed training data. Please request access to the dataset by contacting its creator at jflalonde@gel.ulaval.ca. Note that I requested the processed version directly, which uses cropped HDR images as the input, whereas the original dataset consists of panoramas without local crops.

After obtaining the dataset, the panoramic illumination maps can be processed to generate the training data for the regression network as follows:

cd RegressionNetwork/representation/
python3 distribution_representation.py
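
For illustration only, the following hypothetical sketch shows the kind of ground truth this step produces: it projects an equirectangular HDR panorama onto N anchor points on the unit sphere and records the normalized energy distribution. The actual distribution_representation.py may use a different anchor layout, weighting, and light-source thresholding.

# Hypothetical ground-truth sketch, not the exact logic of distribution_representation.py.
import numpy as np

def fibonacci_sphere(n):
    # Roughly uniform anchor directions on the unit sphere.
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i           # golden-angle increment
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def panorama_to_distribution(pano, n_anchors=128):
    # pano: (H, W, 3) HDR panorama in equirectangular format.
    h, w, _ = pano.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth per column
    tt, pp = np.meshgrid(theta, phi, indexing="ij")
    dirs = np.stack([np.sin(tt) * np.cos(pp),
                     np.sin(tt) * np.sin(pp),
                     np.cos(tt)], axis=-1)            # per-pixel direction
    solid_angle = np.sin(tt) * (np.pi / h) * (2.0 * np.pi / w)
    energy = pano.mean(axis=-1) * solid_angle         # luminance weighted by solid angle

    anchors = fibonacci_sphere(n_anchors)
    nearest = np.argmax(dirs.reshape(-1, 3) @ anchors.T, axis=1)
    dist = np.bincount(nearest, weights=energy.ravel(), minlength=n_anchors)
    return dist / (dist.sum() + 1e-8)                 # normalized light distribution

# Example with a synthetic panorama (replace with a real HDR map in practice).
dist = panorama_to_distribution(np.random.rand(256, 512, 3).astype(np.float32))
print(dist.shape, dist.sum())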

Pretrained Models

The pretrained regression model of EMLight as well as the pretrained DenseNet-121 can be downloaded from Google Drive. Save the pretrained models in RegressionNetwork/checkpoints.
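
A minimal loading sketch, assuming a DenseNet-based regressor class exposed by RegressionNetwork/DenseNet.py and a hypothetical checkpoint file name (both may differ in the actual code and downloaded files):

import torch
from DenseNet import DenseNet  # assumed class name in RegressionNetwork/DenseNet.py

model = DenseNet()             # constructor arguments may differ in the repository
# Hypothetical file name; use the actual checkpoint downloaded from Google Drive.
state = torch.load("RegressionNetwork/checkpoints/emlight_regression.pth", map_location="cpu")
model.load_state_dict(state)   # the checkpoint may instead store a dict with a 'state_dict' key
model.eval()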

Training

Newest Update (Jan-2022)

The sigmoid in the output layers of DenseNet.py should be deleted. To avoid complex learning rate scheduling, I fix the learning rate to 0.0001 in the overfitting stage. The model is trained gradually on subsets of 100, 1000, 2500, ... samples and finally on the full set. If the predictions get stuck at certain points (which may happen occasionally), stop training and reload the weights trained on the previous subset to retrain.

Run the command

cd RegressionNetwork/
python3 train.py

Training tip 1: if training diverges, you may overfit the model on a small subset first and then train it on the full set.

Training tip 2: you can try reducing the number of anchor points in the model (e.g., to 96), which helps convergence.
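
A minimal sketch of the progressive-subset schedule described in the Jan-2022 update and tip 1, assuming placeholder model, dataset, and criterion objects rather than the actual train.py code:

import torch
from torch.utils.data import DataLoader, Subset

def progressive_train(model, dataset, criterion, epochs_per_stage=50, device="cuda"):
    # Fixed learning rate, no scheduler, as described in the Jan-2022 update.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    model.to(device)
    for size in [100, 1000, 2500, len(dataset)]:       # gradually growing subsets
        subset = Subset(dataset, range(min(size, len(dataset))))
        loader = DataLoader(subset, batch_size=16, shuffle=True)
        for _ in range(epochs_per_stage):
            for images, targets in loader:
                images, targets = images.to(device), targets.to(device)
                optimizer.zero_grad()
                loss = criterion(model(images), targets)
                loss.backward()
                optimizer.step()
        # Keep the stage weights so a stuck run can restart from the previous subset.
        torch.save(model.state_dict(), f"checkpoints/stage_{size}.pth")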

Virtual Object Insertion & Rendering

To evaluate the performance of lighting estimation, we created a Virtual Object Relighting (VOR) dataset for object insertion & rendering in Blender. Lighting estimation performance is evaluated by using the predicted illumination map as the environment light in Blender.
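
A hypothetical Blender (bpy) sketch of this rendering setup, with placeholder file paths; the actual VOR scene files and world node setup may differ:

import bpy  # run inside Blender's Python environment

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Use the predicted illumination map as the world environment light.
world = scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
env = nodes.new(type='ShaderNodeTexEnvironment')
env.image = bpy.data.images.load('/path/to/predicted_envmap.hdr')  # hypothetical path
links.new(env.outputs['Color'], nodes['Background'].inputs['Color'])

# Render the scene with the inserted virtual object.
scene.render.filepath = '/path/to/relit_result.png'  # hypothetical output path
bpy.ops.render.render(write_still=True)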

The background scenes in this set include images from the Laval Indoor HDR dataset, the Fast Spatially-Varying Indoor dataset, and some in-the-wild scenes. The dataset can be downloaded from Google Drive.

[Teaser figure]

Citation

If you use this code for your research, please cite our papers.

@inproceedings{zhan2021emlight,
  title={EMLight: Lighting Estimation via Spherical Distribution Approximation},
  author={Zhan, Fangneng and Zhang, Changgong and Yu, Yingchen and Chang, Yuan and Lu, Shijian and Ma, Feiying and Xie, Xuansong},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2021}
}
@article{zhan2022gmlight,
  title={Gmlight: Lighting estimation via geometric distribution approximation},
  author={Zhan, Fangneng and Yu, Yingchen and Zhang, Changgong and Wu, Rongliang and Hu, Wenbo and Lu, Shijian and Ma, Feiying and Xie, Xuansong and Shao, Ling},
  journal={IEEE Transactions on Image Processing},
  volume={31},
  pages={2268--2278},
  year={2022},
  publisher={IEEE}
}
