Developing and Comparing Algorithms for Vision-based Agile Flight

Overview

DodgeDrone: Vision-based Agile Drone Flight (ICRA 2022 Competition)


Would you like to push the boundaries of drone navigation? Then participate in the DodgeDrone competition! You will get the chance to develop perception and control algorithms to navigate a drone in both static and dynamic environments. Competing in the challenge will deepen your expertise in computer vision and control, and boost your research. You can find more information on the competition website.

This codebase provides the following functionalities:

  1. A simple high-level API to evaluate your navigation policy in the Robot Operating System (ROS). This is completely independent of how you develop your algorithm.
  2. Training utilities to use reinforcement learning for the task of high-speed obstacle avoidance.

All evaluation during the competition will be performed using this same ROS evaluation, but on previously unseen environments and obstacle configurations.

Submission

  • 06 May 2022 Submission is open. Please submit your version of the file user_code.py, along with all needed dependencies, via email to loquercio AT berkeley DOT edu. Please use the subject ICRA 2022 Competition: Team Name. If you have specific dependencies, please provide instructions on how to install them. Feel free to switch from Python to C++ if you want.

Further Details

  • We will only evaluate on the warehouse environment with sphere obstacles.
  • If you're using vision, you are free to use any sensor you like (depth, optical flow, RGB). The code has to run in real time on a desktop with an Intel Core i7-6900K (16 threads) and an NVIDIA Titan Xp.
  • If you're using vision, feel free to optimize the camera parameters for performance (e.g. field of view).
  • We will have two rankings, one for vision-based and one for state-based approaches. The top three teams in each category will qualify for the finals.

Update

  • 02 May 2022 Fixed a bug in the vision racing environment when computing the reward function. No need to update if you are not using RL or if you have changed the reward formulation. Related to issue #65.

  • 27 March 2022 Fixed a static object rendering issue. Please download the new Unity Standalone using this. Also, git pull the project.

Flight API

This library contains the core of our testing API. It will be used for evaluating all submitted policies. The API is completely independent of how you build your navigation system. You could either use our reinforcement learning interface (more on this below) or add your favourite navigation system.

Prerequisite

Before continuing, make sure to have g++ and gcc at version 9.3.0. You can check this by typing gcc --version and g++ --version in a terminal. Follow this guide if your compiler is not compatible.

In addition, make sure to have ROS installed. Follow this guide and install ROS Noetic if you don't already have it.

Installation (for ROS User)

We only support Ubuntu 20.04 with ROS Noetic. Other setups are likely to work as well but are not actively supported.

Start by creating a new catkin workspace.

cd     # or wherever you'd like to install this code
export ROS_VERSION=noetic
export CATKIN_WS=./icra22_competition_ws
mkdir -p $CATKIN_WS/src
cd $CATKIN_WS
catkin init
catkin config --extend /opt/ros/$ROS_VERSION
catkin config --merge-devel
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS=-fdiagnostics-color

cd src
git clone git@github.com:uzh-rpg/agile_flight.git
cd agile_flight

Run setup_ros.bash in the main folder of this repository; it will ask for sudo permissions. Then build the packages.

./setup_ros.bash

catkin build

Installation (for Python User)

If you want to develop algorithms using only Python, especially reinforcement learning, you need to install our library as a Python package.

Make sure that you have Anaconda installed. This is highly recommended.

Run setup_py.bash in the main folder of this repository; it will ask for sudo permissions.

./setup_py.bash

Task

The task is to control a simulated quadrotor to fly through obstacle-dense environments. The environment contains both static and dynamic obstacles. You can specify which difficulty level and environment you want to load for testing your algorithm. The YAML configuration file is located in this file. The goal is to proceed as fast as possible for 60 m in the positive x-direction without colliding with obstacles or exiting a pre-defined bounding box. The parameters of the goal location and the bounding box can be found here.

environment:
  level: "medium" # three difficulty levels for obstacle configurations [easy, medium, hard]
  env_folder: "environment_0" # configurations for dynamic and static obstacles; environment numbers range from 0 to 100

unity:
  scene_id: 0 # 0 warehouse, 1 garage, 2 natureforest, 3 wasteland
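
If you prefer to switch the difficulty or the obstacle configuration from a script rather than by hand, the configuration can be edited with a few lines of Python. This is only a convenience sketch; the path below is a placeholder for the actual location of the YAML file linked above.

    import yaml  # PyYAML

    # Placeholder path: substitute the actual location of the configuration file linked above.
    CONFIG_PATH = "path/to/config.yaml"

    with open(CONFIG_PATH) as f:
        cfg = yaml.safe_load(f)

    cfg["environment"]["level"] = "hard"                   # easy | medium | hard
    cfg["environment"]["env_folder"] = "environment_42"    # environment_0 ... environment_100

    with open(CONFIG_PATH, "w") as f:
        yaml.safe_dump(cfg, f, default_flow_style=False)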

Usage

The usage of this code base entails two main aspects: writing your algorithm and testing it in the simulator.

Writing your algorithm:

To facilitate coding your algorithm, we provide a simple code structure: just edit the file envtest/ros/user_code.py. This file contains two functions, compute_command_vision_based and compute_command_state_based. In the vision-based case, you will get the current image and the state of the quadrotor. In the state-based case, you will get the metric distance to obstacles and the state of the quadrotor. We strongly recommend starting with the state-based version; it is going to be much easier than working with pixels!

Depending on the part of the competition you are interested in, adapt the corresponding function. To immediately see something moving, both functions currently publish a command to fly straight forward, of course without avoiding any obstacles. Note that we provide three different control modes, ordered by increasing level of abstraction: commanding individual single-rotor thrusts (SRT), specifying mass-normalized collective thrust and bodyrates (CTBR), and outputting linear velocity commands and yawrate (LINVEL). The choice of control modality is up to you. Overall, the lower-level you go, the more difficult it will be to maintain stability, but the more agile your drone will be.
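
For illustration, here is a minimal state-based sketch in the spirit of the forward-flying example shipped in the template. It is meant to be dropped into envtest/ros/user_code.py, where AgileCommand is already imported; the field names used here (mode, t, velocity, yawrate) are assumptions, so check the template for the exact interface.

    # Minimal LINVEL sketch for envtest/ros/user_code.py (field names assumed
    # from the shipped forward-flying example; verify against the template).
    def compute_command_state_based(state, obstacles, rl_policy=None):
        command_mode = 2                    # LINVEL: linear velocity + yawrate
        command = AgileCommand(command_mode)
        command.t = state.t
        command.velocity = [3.0, 0.0, 0.0]  # fly forward along +x at 3 m/s
        command.yawrate = 0.0
        # TODO: use the metric obstacle information in `obstacles` to add a
        # lateral/vertical velocity component that steers around obstacles.
        return command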

Testing your approach in the simulator:

Make sure you have completed the installation of the flight API before continuing. To use the competition software, three steps are required:

  1. Start the simulator

    roslaunch envsim visionenv_sim.launch render:=True
    # Using the GUI, press Arm & Start to take off.
    python evaluation_node.py
    

    The evaluation node comes with a config file, where plotting of the results can be disabled if you do not want plots.

  2. Start your user code. This code will generate control commands based on the sensory observations. You can toggle vision-based operation by providing the argument --vision_based.

    cd envtest/ros
    python run_competition.py [--vision_based]
    
  3. Tell your code to start! Until you publish this message, your code will run but the commands will not be executed. We use this to ensure fair comparison between approaches as code startup times can vary, especially for learning-based approaches.

    rostopic pub /kingfisher/start_navigation std_msgs/Empty "{}" -1
    

If you want to perform steps 1-3 automatically, you can use the launch_evaluation.bash N script provided in this folder. It will automatically perform N rollouts and then create an evaluation.yaml file which summarizes the rollout statistics.
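
The exact keys recorded in evaluation.yaml depend on the evaluation script, so the snippet below simply loads the file and prints whatever statistics it contains; treat it as a convenience sketch rather than part of the official tooling.

    import yaml  # PyYAML

    # Load the summary written by launch_evaluation.bash and print its contents.
    with open("evaluation.yaml") as f:
        summary = yaml.safe_load(f)

    for key, value in sorted(summary.items()):
        print(f"{key}: {value}")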

Using Reinforcement Learning (Optional)

We provide an easy interface for training your navigation policy using reinforcement learning. While this is not required for the competition, it could just make your job easier if you plan on using RL.

Follow this guide to learn more about how to use the training code and for some tips on developing reinforcement learning algorithms.
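
As a rough illustration of what an RL setup can look like (not the repository's actual training script), here is a generic stable-baselines3 training loop. The CartPole-v1 environment is only a stand-in so the snippet runs on its own; in practice you would plug in the obstacle-avoidance environment wrapper provided by the training utilities.

    import gym
    from stable_baselines3 import PPO

    # Stand-in environment so the sketch is self-contained; replace it with the
    # competition's obstacle-avoidance environment wrapper.
    env = gym.make("CartPole-v1")

    model = PPO("MlpPolicy", env, verbose=1)
    model.learn(total_timesteps=100_000)   # train the policy
    model.save("ppo_policy")               # save for later evaluation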

Owner
Robotics and Perception Group