DeepMLS: Deep Implicit Moving Least-Squares Functions for 3D Reconstruction

This repository contains the implementation of the paper:

Deep Implicit Moving Least-Squares Functions for 3D Reconstruction [arXiv]
Shi-Lin Liu, Hao-Xiang Guo, Hao Pan, Pengshuai Wang, Xin Tong, Yang Liu.

If you find our code or paper useful, please consider citing:

@inproceedings{Liu2021MLS,
 author = {Shi-Lin Liu and Hao-Xiang Guo and Hao Pan and Pengshuai Wang and Xin Tong and Yang Liu},
 title = {Deep Implicit Moving Least-Squares Functions for 3D Reconstruction},
 year = {2021}}

Installation

First you have to make sure that you have all dependencies in place. The simplest way to do so is to use Anaconda.

You can create an Anaconda environment called deep_mls using:

conda env create -f environment.yml
conda activate deep_mls

Next, a few customized TensorFlow modules need to be installed:

O-CNN Module

O-CNN is an octree-based convolution module. Please take the following steps to install it:

cd Octree && git clone https://github.com/microsoft/O-CNN/
cd O-CNN/octree/external && git clone --recursive https://github.com/wang-ps/octree-ext.git
cd .. && mkdir build && cd build
cmake ..  && cmake --build . --config Release
export PATH=`pwd`:$PATH
cd ../../tensorflow/libs && python build.py --cuda /usr/local/cuda-10.0
cp libocnn.so ../../../ocnn-tf/libs
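
As a quick, optional sanity check, you can verify that the compiled O-CNN ops load into TensorFlow. The path below assumes the commands above were run from the repository root, which places the library at Octree/ocnn-tf/libs/libocnn.so; adjust it if your layout differs.

# Optional sanity check: make sure the compiled O-CNN ops can be loaded.
# The path assumes the build/copy commands above were run from the repo root.
import tensorflow as tf

ocnn_lib = "Octree/ocnn-tf/libs/libocnn.so"
ocnn_ops = tf.load_op_library(ocnn_lib)
print("Loaded O-CNN ops from", ocnn_lib)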

Efficient Neighbor Searching Ops

Neighbor searching is used intensively in DeepMLS. For efficiency reasons, we provide several customized neighbor-searching ops:

cd points3d-tf/points3d
bash build.sh

During this step, errors like the following may occur:

tensorflow_core/include/tensorflow/core/util/gpu_kernel_helper.h:22:10: fatal error: third_party/gpus/cuda/include/cuda_fp16.h: No such file or directory
 #include "third_party/gpus/cuda/include/cuda_fp16.h"

To solve this, please refer to this issue.
Basically, we need to edit the headers in the installed TensorFlow framework. Please modify

#include "third_party/gpus/cuda/include/cuda_fp16.h"

in "site-packages/tensorflow_core/include/tensorflow/core/util/gpu_kernel_helper.h" to

#include "cuda_fp16.h"

and

#include "third_party/gpus/cuda/include/cuComplex.h"
#include "third_party/gpus/cuda/include/cuda.h"

in "site-packages/tensorflow_core/include/tensorflow/core/util/gpu_device_functions.h" to

#include "cuComplex.h"
#include "cuda.h"
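
If you prefer not to edit the headers by hand, the same two fixes can be applied with a small script. This is only a convenience sketch: it assumes the headers live under tensorflow.sysconfig.get_include(), rewrites them in place, and therefore needs write access to the TensorFlow installation (back the files up first).

# Convenience sketch: apply the include fixes described above in place.
# Assumes write access to the TensorFlow install; back up the headers first.
import os
import tensorflow as tf

include_dir = tf.sysconfig.get_include()  # .../site-packages/tensorflow_core/include
fixes = {
    "tensorflow/core/util/gpu_kernel_helper.h": {
        '#include "third_party/gpus/cuda/include/cuda_fp16.h"': '#include "cuda_fp16.h"',
    },
    "tensorflow/core/util/gpu_device_functions.h": {
        '#include "third_party/gpus/cuda/include/cuComplex.h"': '#include "cuComplex.h"',
        '#include "third_party/gpus/cuda/include/cuda.h"': '#include "cuda.h"',
    },
}

for rel_path, replacements in fixes.items():
    path = os.path.join(include_dir, rel_path)
    with open(path) as f:
        text = f.read()
    for old, new in replacements.items():
        text = text.replace(old, new)
    with open(path, "w") as f:
        f.write(text)
    print("patched", path)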

Modified Marching Cubes Module

We have modified PyMCubes to obtain a more efficient marching cubes method for extracting the 0-isosurface defined by the MLS points.
To install:

git clone https://github.com/Andy97/PyMCubes
cd PyMCubes && python setup.py install
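
As a quick check that the module installed correctly, you can run the standard PyMCubes interface on a toy volume. This only exercises the basic marching cubes call; the MLS-specific surface extraction is done by mls_marching_cubes.py described below.

# Sanity check of the installed PyMCubes package on a toy SDF volume.
import numpy as np
import mcubes

# Signed distance of a sphere of radius 0.25, sampled on a 64^3 grid in [0, 1]^3.
x, y, z = np.mgrid[:64, :64, :64] / 63.0
u = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2 + (z - 0.5) ** 2) - 0.25

# Extract the 0-isosurface and write it to an OBJ mesh.
vertices, triangles = mcubes.marching_cubes(u, 0.0)
mcubes.export_obj(vertices, triangles, "sphere_check.obj")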

Datasets

Preprocessed ShapeNet Dataset

We provide the preprocessed tfrecords files, which can be used directly.

Our training data is available now (130 GB+ in total)!
Please download all of the following zip files before extraction:
ShapeNet_points_all_train.zip.001
ShapeNet_points_all_train.zip.002
ShapeNet_points_all_train.zip.003
After extraction, set the "train_data" field in the experiment config JSON file to the name of the extracted tfrecords file.
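
For example, the field can be updated with a few lines of Python. The tfrecords file name below is a placeholder; substitute the name of the file you actually extracted, and point config_path at the experiment config you plan to train with.

# Point the experiment config at the extracted tfrecords file.
# Both paths below are placeholders; substitute your actual locations.
import json

config_path = "examples/Config_g2_bs32_1p_d6.json"
with open(config_path) as f:
    config = json.load(f)

config["train_data"] = "ShapeNet_points_all_train.tfrecords"  # placeholder name

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)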

Build the Dataset

If you want to build the dataset from your own data, please follow these steps:

Step 1: Get Watertight Meshes

To acquire watertight meshes, we first preprocess each mesh following the preprocessing steps of Occupancy Networks.

Step 2: Get the ground-truth SDF pairs

From step 1, we already have a watertight version of each model. We then utilize the OpenVDB library to obtain the SDF values and gradients used for training.
For details, please refer to here.
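
The repository's preprocessing relies on OpenVDB as described above; purely as an illustration of what the ground-truth data looks like, here is a stand-in sketch that samples signed distances with trimesh and approximates gradients by central finite differences. The helper name, sample count, and sign convention are assumptions, not the actual preprocessing code.

# Illustrative stand-in for the OpenVDB step: sample (point, sdf, gradient)
# triples from a watertight mesh with trimesh + finite differences.
# This is NOT the repository's preprocessing code, only a sketch of the idea.
import numpy as np
import trimesh

def sample_sdf_with_gradients(mesh_path, num_samples=10000, eps=1e-3):
    mesh = trimesh.load(mesh_path, force="mesh")

    # Random query points in a slightly padded bounding box of the mesh.
    lo, hi = mesh.bounds
    points = np.random.uniform(lo - 0.05, hi + 0.05, size=(num_samples, 3))

    # trimesh returns positive values inside the mesh; negate to get the
    # common "negative inside / positive outside" SDF convention.
    def sdf(p):
        return -trimesh.proximity.signed_distance(mesh, p)

    values = sdf(points)

    # Central finite differences for the SDF gradient.
    grads = np.zeros_like(points)
    for axis in range(3):
        offset = np.zeros(3)
        offset[axis] = eps
        grads[:, axis] = (sdf(points + offset) - sdf(points - offset)) / (2 * eps)

    return points, values, grads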

Usage

Inference using pre-trained model

We provide pretrained models which can be used for inference:

#first download the pretrained models
cd Pretrained && python download_models.py
#then use either of the pretrained models to run the inference
cd .. && python DeepMLS_Generation.py Pretrained/Config_d7_1p_pretrained.json --test

The point cloud input for inference is defined here.
You can replace it with other point cloud files in examples or with your own data.
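
The example inputs are plain-text .xyz point clouds. If you want to try your own shape, a file with one point per row can be written as below; check the files under examples/ for the exact column layout the pretrained models expect (for instance, whether per-point normals are appended).

# Write a point cloud as a plain-text .xyz file (one "x y z" row per point).
# The point data here is a random placeholder; use your own points instead.
import numpy as np

points = np.random.rand(3000, 3) - 0.5
np.savetxt("examples/my_shape.xyz", points, fmt="%.6f")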

Extract Isosurface from MLS Points

After inference, we have the network-predicted MLS points. The next step is to extract the surface:

python mls_marching_cubes.py --i examples/d0fa70e45dee680fa45b742ddc5add59.ply.xyz --o examples/d0fa70e45dee680fa45b742ddc5add59_mc.obj --scale

Training

Our code supports single- and multi-GPU training. For details, please refer to the config JSON file.

python DeepMLS_Generation.py examples/Config_g2_bs32_1p_d6.json

Evaluation

For evaluating the results, ConvONet provides a great script. Please refer to here.
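
If you only need a quick sanity metric before setting up the full evaluation pipeline, a symmetric Chamfer-L1 distance between two sampled point sets can be computed directly. This is a minimal sketch, not the ConvONet evaluation code.

# Minimal symmetric Chamfer-L1 between two point sets (quick sanity metric,
# not a replacement for the full evaluation script).
import numpy as np
from scipy.spatial import cKDTree

def chamfer_l1(points_a, points_b):
    dist_ab, _ = cKDTree(points_b).query(points_a)  # nearest neighbors a -> b
    dist_ba, _ = cKDTree(points_a).query(points_b)  # nearest neighbors b -> a
    return 0.5 * (dist_ab.mean() + dist_ba.mean())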
