Isaac Gym Environments for Legged Robots

Overview

This repository provides the environment used to train ANYmal (and other robots) to walk on rough terrain using NVIDIA's Isaac Gym. It includes all components needed for sim-to-real transfer: actuator network, friction & mass randomization, noisy observations and random pushes during training.
Maintainer: Nikita Rudin
Affiliation: Robotic Systems Lab, ETH Zurich
Contact: [email protected]

Useful Links

Project website: https://leggedrobotics.github.io/legged_gym/
Paper: https://arxiv.org/abs/2109.11978

Installation

  1. Create a new Python virtual environment with Python 3.6, 3.7, or 3.8 (3.8 recommended)
  2. Install pytorch 1.10 with cuda-11.3:
    • pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
  3. Install Isaac Gym
    • Download and install Isaac Gym Preview 3 (Preview 2 will not work!) from https://developer.nvidia.com/isaac-gym
    • cd isaacgym_lib/python && pip install -e .
    • Try running an example: python examples/1080_balls_of_solitude.py
    • For troubleshooting, check the docs at isaacgym/docs/index.html
  4. Install rsl_rl (PPO implementation)
    • Clone the rsl_rl repository
    • cd rsl_rl && pip install -e .
  5. Install legged_gym
    • Clone this repository
    • cd legged_gym && git checkout develop && pip install -e .
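
To verify the setup, a quick check like the one below can be run inside the new environment (a minimal sketch, assuming the packages above were installed there; note that Isaac Gym must be imported before torch):

    # check_install.py - minimal sanity check for the installation (sketch)
    import isaacgym    # must come before torch, otherwise Isaac Gym refuses to load
    import legged_gym  # fails if `pip install -e .` was not run in this repository
    import torch

    print(torch.__version__)                             # expect 1.10.0+cu113
    print("CUDA available:", torch.cuda.is_available())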

Code Structure

  1. Each environment is defined by an env file (legged_robot.py) and a config file (legged_robot_config.py). The config file contains two classes: one containing all the environment parameters (LeggedRobotCfg) and one for the training parameters (LeggedRobotCfgPPO).
  2. Both env and config classes use inheritance.
  3. Each non-zero reward scale specified in the cfg adds a function with a corresponding name to the list of functions whose returns are summed to obtain the total reward (a sketch follows this list).
  4. Tasks must be registered using task_registry.register(name, EnvClass, EnvConfig, TrainConfig). This is done in envs/__init__.py, but can also be done from outside of this repository.
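
As an illustration of point 3, a rough sketch of how a custom reward term could be wired in is shown below. The _reward_ prefix and the cfg.rewards.scales location are assumptions based on the base classes and should be checked against the code:

    # Sketch: adding a reward term called "feet_clearance" (name is illustrative)
    import torch
    from legged_gym.envs.base.legged_robot import LeggedRobot

    class MyLeggedRobot(LeggedRobot):
        def _reward_feet_clearance(self):
            # Returns one value per environment. With a non-zero scale
            # cfg.rewards.scales.feet_clearance, this term is multiplied by that
            # scale and summed into the total reward.
            return torch.zeros(self.num_envs, device=self.device)  # placeholder term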

Usage

  1. Train:
    python isaacgym_anymal/scripts/train.py --task=anymal_c_flat
    • To run on CPU add the following arguments: --sim_device=cpu, --rl_device=cpu (sim on CPU and rl on GPU is possible).
    • To run headless (no rendering) add --headless.
    • Important: To improve performance, once the training starts press v to stop the rendering. You can then enable it later to check the progress.
    • The trained policy is saved in isaacgym_anymal/logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config.
    • The following command line arguments override the values set in the config files:
    • --task TASK: Task name.
    • --resume: Resume training from a checkpoint
    • --experiment_name EXPERIMENT_NAME: Name of the experiment to run or load.
    • --run_name RUN_NAME: Name of the run.
    • --load_run LOAD_RUN: Name of the run to load when resume=True. If -1: will load the last run.
    • --checkpoint CHECKPOINT: Saved model checkpoint number. If -1: will load the last checkpoint.
    • --num_envs NUM_ENVS: Number of environments to create.
    • --seed SEED: Random seed.
    • --max_iterations MAX_ITERATIONS: Maximum number of training iterations.
  2. Play a trained policy:
    python isaacgym_anymal/scripts/play.py --task=anymal_c_flat
    • By default the loaded policy is the last model of the last run of the experiment folder.
    • Other runs/model iterations can be selected by setting load_run and checkpoint in the train config (a minimal Python sketch of this workflow follows).
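
For reference, the play script's workflow can also be reproduced in a few lines of Python. The sketch below follows the pattern of play.py; the task_registry function names (get_cfgs, make_env, make_alg_runner, get_inference_policy) are taken from the repository and should be verified against the installed version:

    # Sketch of the play.py pattern (function names assumed from task_registry)
    import isaacgym  # must be imported before torch
    from legged_gym.utils import get_args, task_registry

    args = get_args()
    env_cfg, train_cfg = task_registry.get_cfgs(name=args.task)
    env, _ = task_registry.make_env(name=args.task, args=args, env_cfg=env_cfg)

    train_cfg.runner.resume = True  # load the run/checkpoint selected by load_run / checkpoint
    ppo_runner, train_cfg = task_registry.make_alg_runner(env=env, name=args.task,
                                                          args=args, train_cfg=train_cfg)
    policy = ppo_runner.get_inference_policy(device=env.device)

    obs = env.get_observations()
    for _ in range(1000):
        actions = policy(obs.detach())
        obs, _, rews, dones, infos = env.step(actions.detach())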

Adding a new environment

The base environment legged_robot implements a rough terrain locomotion task. The corresponding cfg does not specify a robot asset (URDF/MJCF) or any reward scales.

  1. Add a new folder to envs/ with <your_env>_config.py, which inherits from an existing environment cfg (a sketch of these steps follows this list)
  2. If adding a new robot:
    • Add the corresponding assets to resources/.
    • In cfg set the asset path, define body names, default_joint_positions and PD gains. Specify the desired train_cfg and the name of the environment (python class).
    • In train_cfg set experiment_name and run_name
  3. (If needed) implement your environment in <your_env>.py, inherit from an existing environment, overwrite the desired functions and/or add your reward functions.
  4. Register your env in isaacgym_anymal/envs/__init__.py.
  5. Modify/Tune other parameters in your cfg, cfg_train as needed. To remove a reward set its scale to zero. Do not modify parameters of other envs!
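
A hedged sketch of these steps for a hypothetical robot called my_robot is given below; the nested class and field names mirror the base LeggedRobotCfg / LeggedRobotCfgPPO configs and should be checked against those base classes:

    # my_robot_config.py - illustrative sketch, all names are hypothetical
    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO
    from legged_gym.utils.task_registry import task_registry

    class MyRobotCfg(LeggedRobotCfg):
        class asset(LeggedRobotCfg.asset):
            file = '{LEGGED_GYM_ROOT_DIR}/resources/robots/my_robot/urdf/my_robot.urdf'
            foot_name = "FOOT"                       # substring used to find the feet bodies

        class init_state(LeggedRobotCfg.init_state):
            default_joint_angles = {"joint_a": 0.0}  # one entry per actuated joint

        class control(LeggedRobotCfg.control):
            stiffness = {"joint": 80.0}              # PD gains, matched by joint-name substring
            damping = {"joint": 2.0}

    class MyRobotCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = 'my_robot'
            run_name = ''

    # in envs/__init__.py:
    task_registry.register("my_robot", LeggedRobot, MyRobotCfg(), MyRobotCfgPPO())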

Troubleshooting

  1. If you get the following error: ImportError: libpython3.8m.so.1.0: cannot open shared object file: No such file or directory, do: sudo apt install libpython3.8

Known Issues

  1. The contact forces reported by net_contact_force_tensor are unreliable when simulating on GPU with a triangle mesh terrain. A workaround is to use force sensors, but the forces are propagated through the sensors of consecutive bodies, resulting in undesirable behaviour. However, for a legged robot it is possible to add sensors to the feet/end effector only and get the expected results. When using the force sensors, make sure to exclude gravity from the reported forces by setting sensor_options.enable_forward_dynamics_forces to False. Example:
    sensor_pose = gymapi.Transform()
    for name in feet_names:
        sensor_options = gymapi.ForceSensorProperties()
        sensor_options.enable_forward_dynamics_forces = False # for example gravity
        sensor_options.enable_constraint_solver_forces = True # for example contacts
        sensor_options.use_world_frame = True # report forces in world frame (easier to get vertical components)
        index = self.gym.find_asset_rigid_body_index(robot_asset, name)
        self.gym.create_asset_force_sensor(robot_asset, index, sensor_pose, sensor_options)
    (...)

    sensor_tensor = self.gym.acquire_force_sensor_tensor(self.sim)
    self.gym.refresh_force_sensor_tensor(self.sim)
    force_sensor_readings = gymtorch.wrap_tensor(sensor_tensor)
    self.sensor_forces = force_sensor_readings.view(self.num_envs, 4, 6)[..., :3]
    (...)

    self.gym.refresh_force_sensor_tensor(self.sim)
    contact = self.sensor_forces[:, :, 2] > 1.
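
In this snippet the readings are reshaped assuming one 6-axis sensor (3 force + 3 torque components) per foot and four feet, keeping only the force components; the last line then flags a foot as in contact when its vertical force in the world frame exceeds 1 N.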
Owner
Robotic Systems Lab - Legged Robotics at ETH Zürich
The Robotic Systems Lab investigates the development of machines and their intelligence to operate in rough and challenging environments.