Isaac Gym Environments for Legged Robots


This repository provides the environment used to train ANYmal (and other robots) to walk on rough terrain using NVIDIA's Isaac Gym. It includes all components needed for sim-to-real transfer: actuator network, friction & mass randomization, noisy observations and random pushes during training.
Maintainer: Nikita Rudin
Affiliation: Robotic Systems Lab, ETH Zurich
Contact: [email protected]

Useful Links

Project website: https://leggedrobotics.github.io/legged_gym/
Paper: https://arxiv.org/abs/2109.11978

Installation

  1. Create a new python virtual env with python 3.6, 3.7 or 3.8 (3.8 recommended)
  2. Install pytorch 1.10 with cuda-11.3:
    • pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
  3. Install Isaac Gym
    • Download and install Isaac Gym Preview 3 (Preview 2 will not work!) from https://developer.nvidia.com/isaac-gym
    • cd isaacgym_lib/python && pip install -e .
    • Try running an example: python examples/1080_balls_of_solitude.py
    • For troubleshooting, check the docs at isaacgym/docs/index.html
  4. Install rsl_rl (PPO implementation)
    • Clone https://github.com/leggedrobotics/rsl_rl and run pip install -e . inside the cloned folder
  5. Install legged_gym
    • Clone this repository
    • cd legged_gym && git checkout develop && pip install -e .
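
After installation, a quick sanity check (a minimal sketch, not part of the repository) is to verify in Python that Isaac Gym and PyTorch import together and that CUDA is visible; note that isaacgym must be imported before torch:

    import isaacgym  # Isaac Gym must be imported before torch
    import torch

    print(torch.__version__)          # expect 1.10.0+cu113
    print(torch.cuda.is_available())  # expect True on a correctly configured GPU setup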

Code Structure

  1. Each environment is defined by an env file (legged_robot.py) and a config file (legged_robot_config.py). The config file contains two classes: one containing all the environment parameters (LeggedRobotCfg) and one for the training parameters (LeggedRobotCfgPPO).
  2. Both env and config classes use inheritance.
  3. Each non-zero reward scale specified in the cfg adds a reward function with the corresponding name (_reward_<name>) to the list of terms that are summed to obtain the total reward (see the sketch after this list).
  4. Tasks must be registered using task_registry.register(name, EnvClass, EnvConfig, TrainConfig). This is done in envs/__init__.py, but can also be done from outside of this repository.
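
The sketch below shows how a config, a custom reward function and task registration fit together. It is not the repository's exact code: the class names MyRobot, MyRobotCfg, MyRobotCfgPPO and the reward term my_term are illustrative, and the import paths are assumptions based on the layout described above.

    # legged_gym imports pull in isaacgym, which must be imported before torch
    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO
    from legged_gym.utils.task_registry import task_registry
    import torch

    class MyRobotCfg(LeggedRobotCfg):
        class rewards(LeggedRobotCfg.rewards):
            class scales(LeggedRobotCfg.rewards.scales):
                my_term = 0.5  # non-zero scale -> _reward_my_term() is added to the reward terms

    class MyRobotCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = 'my_robot'

    class MyRobot(LeggedRobot):
        def _reward_my_term(self):
            # Custom reward term, one value per environment
            return torch.ones(self.num_envs, device=self.device)

    # Register the task so it can be selected with --task=my_robot
    # (normally done in envs/__init__.py, but it can also be done from outside this repository)
    task_registry.register('my_robot', MyRobot, MyRobotCfg(), MyRobotCfgPPO())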

Usage

  1. Train:
    python issacgym_anymal/scripts/train.py --task=anymal_c_flat
    • To run on CPU add following arguments: --sim_device=cpu, --rl_device=cpu (sim on CPU and rl on GPU is possible).
    • To run headless (no rendering) add --headless.
    • Important: To improve performance, once the training starts press v to stop the rendering. You can then enable it later to check the progress.
    • The trained policy is saved in issacgym_anymal/logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config.
    • The following command line arguments override the values set in the config files:
    • --task TASK: Task name.
    • --resume: Resume training from a checkpoint
    • --experiment_name EXPERIMENT_NAME: Name of the experiment to run or load.
    • --run_name RUN_NAME: Name of the run.
    • --load_run LOAD_RUN: Name of the run to load when resume=True. If -1: will load the last run.
    • --checkpoint CHECKPOINT: Saved model checkpoint number. If -1: will load the last checkpoint.
    • --num_envs NUM_ENVS: Number of environments to create.
    • --seed SEED: Random seed.
    • --max_iterations MAX_ITERATIONS: Maximum number of training iterations.
  2. Play a trained policy:
    python issacgym_anymal/scripts/play.py --task=anymal_c_flat
    • By default the loaded policy is the last model of the last run of the experiment folder.
    • Other runs/model iterations can be selected by setting load_run and checkpoint in the train config.
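
For reference, the training entry point roughly follows the pattern sketched below. This is a simplified sketch assuming the task_registry helpers make_env and make_alg_runner; see the actual train script in the repository for the exact code.

    import isaacgym  # must be imported before torch
    from legged_gym.envs import *  # importing envs registers all tasks
    from legged_gym.utils import get_args, task_registry

    def train(args):
        # Build the vectorized environment and the PPO runner for the selected task
        env, env_cfg = task_registry.make_env(name=args.task, args=args)
        ppo_runner, train_cfg = task_registry.make_alg_runner(env=env, name=args.task, args=args)
        ppo_runner.learn(num_learning_iterations=train_cfg.runner.max_iterations)

    if __name__ == '__main__':
        train(get_args())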

Adding a new environment

The base environment legged_robot implements a rough terrain locomotion task. The corresponding cfg does not specify a robot asset (URDF/MJCF) and defines no reward scales.

  1. Add a new folder to envs/ with '<your_env>_config.py', which inherits from existing environment configs.
  2. If adding a new robot:
    • Add the corresponding assets to resourses/.
    • In cfg set the asset path, define body names, default_joint_positions and PD gains. Specify the desired train_cfg and the name of the environment (python class). See the sketch after this list.
    • In train_cfg set experiment_name and run_name
  3. (If needed) implement your environment in <your_env>.py, inherit from an existing environment, overwrite the desired functions and/or add your reward functions.
  4. Register your env in isaacgym_anymal/envs/__init__.py.
  5. Modify/Tune other parameters in your cfg, cfg_train as needed. To remove a reward set its scale to zero. Do not modify parameters of other envs!
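
A minimal sketch of step 2, assuming the nested-class config style of LeggedRobotCfg; the robot name, joint names and gain values below are placeholders, and the exact attribute names should be checked against the base config you inherit from.

    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO

    class MyRobotRoughCfg(LeggedRobotCfg):
        class asset(LeggedRobotCfg.asset):
            # Path to the asset added under resources/
            file = '{LEGGED_GYM_ROOT_DIR}/resources/robots/my_robot/urdf/my_robot.urdf'
            foot_name = 'FOOT'

        class init_state(LeggedRobotCfg.init_state):
            # Target joint angles [rad] when the action is 0.0
            default_joint_angles = {'HAA': 0.0, 'HFE': 0.4, 'KFE': -0.8}

        class control(LeggedRobotCfg.control):
            stiffness = {'HAA': 80.0, 'HFE': 80.0, 'KFE': 80.0}  # P gains [N*m/rad]
            damping = {'HAA': 2.0, 'HFE': 2.0, 'KFE': 2.0}       # D gains [N*m*s/rad]

    class MyRobotRoughCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = 'rough_my_robot'
            run_name = ''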

Troubleshooting

  1. If you get the following error: ImportError: libpython3.8m.so.1.0: cannot open shared object file: No such file or directory, do: sudo apt install libpython3.8

Known Issues

  1. The contact forces reported by net_contact_force_tensor are unreliable when simulating on GPU with a triangle mesh terrain. A workaround is to use force sensors, but the forces are propagated through the sensors of consecutive bodies, resulting in undesirable behaviour. However, for a legged robot it is possible to add sensors to the feet/end effectors only and get the expected results. When using the force sensors, make sure to exclude gravity from the reported forces with sensor_options.enable_forward_dynamics_forces = False. Example:
    sensor_pose = gymapi.Transform()
    for name in feet_names:
        sensor_options = gymapi.ForceSensorProperties()
        sensor_options.enable_forward_dynamics_forces = False # for example gravity
        sensor_options.enable_constraint_solver_forces = True # for example contacts
        sensor_options.use_world_frame = True # report forces in world frame (easier to get vertical components)
        index = self.gym.find_asset_rigid_body_index(robot_asset, name)
        self.gym.create_asset_force_sensor(robot_asset, index, sensor_pose, sensor_options)
    (...)

    sensor_tensor = self.gym.acquire_force_sensor_tensor(self.sim)
    self.gym.refresh_force_sensor_tensor(self.sim)
    force_sensor_readings = gymtorch.wrap_tensor(sensor_tensor)
    self.sensor_forces = force_sensor_readings.view(self.num_envs, 4, 6)[..., :3]
    (...)

    self.gym.refresh_force_sensor_tensor(self.sim)
    contact = self.sensor_forces[:, :, 2] > 1.
Owner
Robotic Systems Lab - Legged Robotics at ETH Zürich
The Robotic Systems Lab investigates the development of machines and their intelligence to operate in rough and challenging environments.