Elevation Mapping cupy

Overview

This is a ROS package for elevation mapping on GPU.
The code is written in Python and uses CuPy for GPU computation.

* Plane segmentation is coming soon.

Citing

Takahiro Miki, Lorenz Wellhausen, Ruben Grandia, Fabian Jenelten, Timon Homberger, Marco Hutter
Elevation Mapping for Locomotion and Navigation using GPU. arXiv:2204.12876, 2022.

@misc{https://doi.org/10.48550/arxiv.2204.12876,
  doi = {10.48550/ARXIV.2204.12876},
  url = {https://arxiv.org/abs/2204.12876},
  author = {Miki, Takahiro and Wellhausen, Lorenz and Grandia, Ruben and Jenelten, Fabian and Homberger, Timon and Hutter, Marco},
  keywords = {Robotics (cs.RO), FOS: Computer and information sciences},
  title = {Elevation Mapping for Locomotion and Navigation using GPU},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}

Installation

CUDA & cuDNN

The tested versions are CUDA 10.2 and 11.6.

Install CUDA and cuDNN by following the official NVIDIA installation guides.

Python dependencies

You will need numpy, scipy, shapely and opencv-python, as well as CuPy (see the Cupy section below).

For the traversability filter, you will need either PyTorch or Chainer (see the Traversability filter section).

Optionally, OpenCV is used for the inpainting filter.

Install numpy, scipy, shapely and opencv-python with the following command:

pip3 install -r requirements.txt

Cupy

CuPy is installed per CUDA version; pick the wheel that matches yours. (On Jetson, only installing from source, i.e. pip install cupy, worked.)

For CUDA 10.2: pip install cupy-cuda102
For CUDA 11.0: pip install cupy-cuda110
For CUDA 11.1: pip install cupy-cuda111
For CUDA 11.2: pip install cupy-cuda112
For CUDA 11.3: pip install cupy-cuda113
For CUDA 11.4: pip install cupy-cuda114
For CUDA 11.5: pip install cupy-cuda115
For CUDA 11.6: pip install cupy-cuda116
To install CuPy from source: pip install cupy
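
To verify that CuPy can see the GPU after installation, a quick sanity check (independent of this package) is:

import cupy as cp

# Query the number of visible CUDA devices and run a tiny computation on the GPU.
print(cp.cuda.runtime.getDeviceCount())
print(cp.arange(5) * 2)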

Traversability filter

You can choose either PyTorch or Chainer to run the CNN-based traversability filter.
Install one of them by following its official documentation.

PyTorch uses about 2 GB more GPU memory than Chainer, but runs a bit faster.
Use the parameter use_chainer to select which backend to use.
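
A minimal sketch of selecting the backend, assuming use_chainer sits alongside the other settings in config/parameters.yaml (the exact location may differ in your setup):

use_chainer: false    # false: use the PyTorch backend, true: use the Chainer backend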

ROS package dependencies

sudo apt install ros-noetic-pybind11-catkin
sudo apt install ros-noetic-grid-map-core ros-noetic-grid-map-msgs

On Jetson

CUDA & cuDNN

CUDA and cuDNN can be installed via apt; they come with nvidia-jetpack. The tested version is JetPack 4.5 with L4T 32.5.0.
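
Assuming the standard JetPack apt meta-package, it can be installed with:

sudo apt install nvidia-jetpack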

Python dependencies

On Jetson, you need the PyTorch wheel built for its CPU architecture (aarch64):

wget https://nvidia.box.com/shared/static/p57jwntv436lfrd78inwl7iml6p13fzh.whl -O torch-1.8.0-cp36-cp36m-linux_aarch64.whl
pip3 install Cython
pip3 install numpy==1.19.5 torch-1.8.0-cp36-cp36m-linux_aarch64.whl

Also, you need to install cupy with

pip3 install cupy

This builds the package from source, so it will take some time.

ROS dependencies

sudo apt install ros-melodic-pybind11-catkin
sudo apt install ros-melodic-grid-map-core ros-melodic-grid-map-msgs

Also, on Jetson you need a Fortran compiler (it should already be installed).

sudo apt install gfortran

If the Jetson is set up with JetPack 4.5 and ROS Melodic, the following package is additionally required:

git clone git@github.com:ros/filters.git -b noetic-devel

Usage

Build

catkin build elevation_mapping_cupy

Errors

If you get an error such as:

CMake Error at /usr/share/cmake-3.16/Modules/FindPackageHandleStandardArgs.cmake:146 (message):
  Could NOT find PythonInterp: Found unsuitable version "2.7.18", but
  required is at least "3" (found /usr/bin/python)

build with the following option:

catkin build elevation_mapping_cupy -DPYTHON_EXECUTABLE=$(which python3)

Run

Basic usage.

roslaunch elevation_mapping_cupy elevation_mapping_cupy.launch

Run TurtleBot example

First, install the TurtleBot3 simulation packages.

sudo apt install ros-noetic-turtlebot3*

Then, you can run the example.

export TURTLEBOT3_MODEL=waffle
roslaunch elevation_mapping_cupy turtlesim_example.launch

To control the robot with a keyboard, open a new terminal window and run:

export TURTLEBOT3_MODEL=waffle
roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

Velocity inputs can be sent to the robot by pressing the keys a, w, d, x. To stop the robot completely, press s.

Subscribed Topics

  • Topics specified under pointcloud_topics in parameters.yaml ([sensor_msgs/PointCloud2])

    The distance measurements. (A configuration sketch follows this list.)

  • /tf ([tf/tfMessage])

    The transformation tree.
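
A minimal sketch of how these inputs might be listed in config/parameters.yaml; the topic name below is hypothetical and should be replaced with your sensor's topic:

pointcloud_topics: ['/camera/depth/points']    # hypothetical depth-sensor topic; add one entry per sensor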

Published Topics

The topics are published as configured via the ROS parameters.
You can specify which layers to publish and at what rate (fps).

Under publishers, you can specify the topic_name, layers, basic_layers and fps.

publishers:
  your_topic_name:
    layers: ['list_of_layer_names', 'layer1', 'layer2']             # Choose from 'elevation', 'variance', 'traversability', 'time' + plugin layers
    basic_layers: ['list of basic layers', 'layer1']                # basic_layers for valid cell computation (e.g. Rviz): Choose a subset of `layers`.
    fps: 5.0                                                        # Publish rate. Use smaller value than `map_acquire_fps`.

Example settings in config/parameters.yaml define the following topics; a sketch of the corresponding publisher entries follows the list.

  • elevation_map_raw ([grid_map_msgs/GridMap])

    The entire elevation map.

  • elevation_map_recordable ([grid_map_msgs/GridMap])

    The entire elevation map with slower update rate for visualization and logging.

  • elevation_map_filter ([grid_map_msgs/GridMap])

    The maps filtered by plugins.
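
As a sketch only (not the exact shipped configuration), the publishers entries in config/parameters.yaml that produce the topics above could look like this; the layer selections and rates are illustrative:

publishers:
  elevation_map_raw:
    layers: ['elevation', 'traversability', 'variance', 'time']
    basic_layers: ['elevation']
    fps: 5.0
  elevation_map_recordable:
    layers: ['elevation', 'traversability']
    basic_layers: ['elevation']
    fps: 2.0                         # slower rate for visualization and logging
  elevation_map_filter:
    layers: ['example_layer']        # plugin layers (see the Plugins section)
    basic_layers: ['example_layer']
    fps: 3.0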

Plugins

You can create your own plugin to process the elevation map and publish it as a layer in the GridMap message.

Let's look at an example.

First, create your plugin file in elevation_mapping_cupy/script/plugins/ and save it as example.py.

import cupy as cp
from typing import List
from .plugin_manager import PluginBase


class NameOfYourPlugin(PluginBase):
    def __init__(self, add_value: float = 1.0, **kwargs):
        super().__init__()
        self.add_value = float(add_value)

    def __call__(self, elevation_map: cp.ndarray, layer_names: List[str],
                 plugin_layers: cp.ndarray, plugin_layer_names: List[str]) -> cp.ndarray:
        # Process maps here
        # You can also use the other plugin's data through plugin_layers.
        new_elevation = elevation_map[0] + self.add_value
        return new_elevation

Then, add your plugin settings to config/plugin_config.yaml:

example:                                      # Use the same name as your file name.
  enable: True                                # Whether to load this plugin.
  fill_nan: True                              # Fill invalid cells of the elevation layer with NaNs.
  is_height_layer: True                       # Whether this is a height layer (such as elevation) or not (such as traversability).
  layer_name: "example_layer"                 # The layer name.
  extra_params:                               # These params are passed to the plugin class on initialization.
    add_value: 2.0                            # Example param

Finally, add your layer name to publishers in config/parameters.yaml. You can create a new topic or add to existing topics.

  plugin_example:   # Topic name
    layers: ['elevation', 'example_layer']
    basic_layers: ['example_layer']
    fps: 1.0        # The plugin is called with this fps.
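
As the comment in the example plugin notes, a plugin can also read other plugins' outputs through plugin_layers and plugin_layer_names. A minimal sketch of such a plugin, assuming the same PluginBase interface and reusing the example_layer name from above:

import cupy as cp
from typing import List
from .plugin_manager import PluginBase


class CombineExample(PluginBase):
    def __call__(self, elevation_map: cp.ndarray, layer_names: List[str],
                 plugin_layers: cp.ndarray, plugin_layer_names: List[str]) -> cp.ndarray:
        # Look up another plugin's layer by name and combine it with the elevation layer.
        if 'example_layer' in plugin_layer_names:
            other = plugin_layers[plugin_layer_names.index('example_layer')]
            return elevation_map[0] + other
        return elevation_map[0]
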
Owner
Robotic Systems Lab - Legged Robotics at ETH Zürich
The Robotic Systems Lab investigates the development of machines and their intelligence to operate in rough and challenging environments.