Code for the paper Task-Agnostic Morphology Evolution.

Overview

Task-Agnostic Morphology Optimization

This repository contains code for the paper Task-Agnostic Morphology Evolution by Donald (Joey) Hejna, Pieter Abbeel, and Lerrel Pinto published at ICLR 2021.

The code has been cleaned up to make it easier to use. An older version of the code was made available with the ICLR submission here.

Setup

The code was tested and used on Ubuntu 20.04. Our baseline implementations use taskset, a Linux utility for setting CPU affinity. Some of the experiments require taskset and will fail without it.
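
For illustration only (the baseline code invokes taskset itself, so you normally do not need to run it by hand), taskset restricts a process to a given set of CPU cores; the core list below is an arbitrary example:

taskset -c 0-3 python scripts/train_ea.py -p configs/locomotion2d/2d_nge_pruning.yaml

You can check that taskset is available with which taskset; on Ubuntu it is provided by the util-linux package.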

Create the conda environment from the provided file with conda env create -f environment.yml. Because this project involves only state-based RL, the environment does not install CUDA and the code is set up to run on the CPU. Activate the environment with conda activate morph_opt.

Next, install the optimal_agents package by running pip install -e . from the root of the repository; this uses the provided setup.py.
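
In short, the full setup from a fresh clone is:

conda env create -f environment.yml
conda activate morph_opt
pip install -e .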

The code is built on top of Stable Baselines 3, PyTorch, and PyTorch Geometric. The exact specified version of Stable Baselines 3 is required.
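
As a quick sanity check that the dependencies installed correctly, the core packages should all import inside the activated environment; the one-liner below is just a convenience, and the version printed should match the Stable Baselines 3 version specified for the environment:

python -c "import torch, torch_geometric, stable_baselines3; print(stable_baselines3.__version__)"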

Running Experiments

Configs for the 2D experiments are currently in the repo; more of the config files that form the basis for the paper's experiments are still being pushed. To run the large-scale experiments for the publication, we used additional AWS tools.

Evolution experiments can be run using the train_ea.py script found in the scripts directory. Below are example commands for running different morphology evolution algorithms:

python scripts/train_ea.py -p configs/locomotion2d/2d_tame.yaml

python scripts/train_ea.py -p configs/locomotion2d/2d_tamr.yaml

python scripts/train_ea.py -p configs/locomotion2d/2d_nge_no_pruning.yaml

python scripts/train_ea.py -p configs/locomotion2d/2d_nge_pruning.yaml

After running evolution to discover good morphologies, you can evaluate them using PPO via the provided eval configs.

python scripts/train_rl.py -p configs/locomotion2d/2d_eval.yaml

Note that you have to edit the config file to include either the path to the optimized morphology or a predefined type like random2d or cheetah. We evaluate all morphologies across a number of different environments. The provided configuration file runs evaluations for just one.

To keep better track of experiments, you can edit the name field in the config files.

By default, experiments are saved to the data directory. This can be changed by providing an output location with the -o flag.
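
For example, to save results somewhere other than the data directory (the output path below is just a placeholder):

python scripts/train_ea.py -p configs/locomotion2d/2d_tame.yaml -o /path/to/output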

Rendering, Testing, and Plotting

See the test scripts for viewing agents after they have been trained.

For plotting results like those in the paper, use the plotting scripts. Note that to use the plotting scripts correctly, a specific directory structure is required. Details for this can be found in optimal_agents/utils/plotter.py.

Citing

If you use this code, please cite the paper.
