
Contrastive Pre-training and Data Augmentation for Efficient Robotic Learning (CoDER)

CoDER, also referred to as FERM (A Framework for Efficient Robotic Manipulation), is a method that enables robots to learn manipulation tasks within an hour of real-time training.

Paper: https://arxiv.org/abs/2012.07975

Project Page: https://sites.google.com/view/efficient-robotic-manipulation

The project and this codebase are joint work by Albert Zhan*, Ruihan (Philip) Zhao*, Lerrel Pinto, Pieter Abbeel, and Misha Laskin. The implementation is based on RAD.

Getting Started

Create a conda environment, activate it, and install the required packages:

conda create -n ferm python=3.7
conda activate ferm
pip install -r requirements.txt

Running Experiments

Sample scripts are included in the scripts folder, covering training, evaluation, and behavior cloning baselines. To launch an experiment, navigate to the project root folder and run

./scripts/script_name.sh

Robotic Experiments

To run robotic experiments, create a gym environment interface for your robotic setup, and pass its registered environment name via the --domain_name flag. A minimal sketch of such an interface is shown below.
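
The following sketch (hypothetical class and environment names; observation and action shapes are placeholders only) illustrates one way to wrap a robot setup in the standard gym interface and register it so it can be selected with --domain_name:

import gym
import numpy as np
from gym import spaces

class MyRobotEnv(gym.Env):
    # Hypothetical wrapper around a physical robot; shapes are illustrative only.
    def __init__(self):
        self.observation_space = spaces.Box(0, 255, shape=(3, 84, 84), dtype=np.uint8)
        self.action_space = spaces.Box(-1.0, 1.0, shape=(4,), dtype=np.float32)

    def reset(self):
        # Move the robot to its start pose and return the first camera frame.
        return np.zeros(self.observation_space.shape, dtype=np.uint8)

    def step(self, action):
        # Send the action to the robot, then return (obs, reward, done, info).
        obs = np.zeros(self.observation_space.shape, dtype=np.uint8)
        return obs, 0.0, False, {}

gym.envs.registration.register(id="MyRobot-v0", entry_point=MyRobotEnv)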

Using Demonstrations

Real-world demonstrations

To use demonstrations, save the (obs, next_obs, actions, rewards, not_dones) demonstration tuple (a tuple of five length-X lists) to 0_X.pt, where X is the number of entries saved. Then include the --replay_buffer_load_dir=work_directory_path/0_X.pt flag when launching training. A small saving sketch follows.
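
As a concrete illustration, the sketch below (dummy data; array shapes are placeholders) packages X collected transitions into the 0_X.pt format with torch.save:

import numpy as np
import torch

X = 10  # number of demonstration transitions collected
obs       = [np.zeros((3, 84, 84), dtype=np.uint8) for _ in range(X)]
next_obs  = [np.zeros((3, 84, 84), dtype=np.uint8) for _ in range(X)]
actions   = [np.zeros(4, dtype=np.float32) for _ in range(X)]
rewards   = [0.0 for _ in range(X)]
not_dones = [1.0 for _ in range(X)]

# One tuple of five length-X lists, saved under the 0_X.pt naming convention.
torch.save((obs, next_obs, actions, rewards, not_dones), "0_{}.pt".format(X))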

Sim demonstrations

Our sim experiments use large numbers of demonstrations, which are generated on the fly by an expert policy that operates on state input. Include the flags --demo_model_dir=path_to_expert --demo_model_step=X, where the expert policy is saved as path_to_expert/model/actor_X.pt and path_to_expert/model/critic_X.pt (see the layout sketch below).
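
The sketch below (placeholder modules and step count; assuming RAD-style state_dict checkpoints) illustrates the on-disk layout those flags expect:

import os
import torch
import torch.nn as nn

path_to_expert = "./expert_run"   # value passed to --demo_model_dir
X = 100000                        # value passed to --demo_model_step
model_dir = os.path.join(path_to_expert, "model")
os.makedirs(model_dir, exist_ok=True)

actor = nn.Linear(10, 4)    # stand-in for the trained state-based actor
critic = nn.Linear(14, 1)   # stand-in for the trained critic

torch.save(actor.state_dict(), os.path.join(model_dir, "actor_{}.pt".format(X)))
torch.save(critic.state_dict(), os.path.join(model_dir, "critic_{}.pt".format(X)))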

Citation

@article{zhan2020framework,
  title={A Framework for Efficient Robotic Manipulation},
  author={Zhan, Albert and Zhao, Philip and Pinto, Lerrel and Abbeel, Pieter and Laskin, Michael},
  journal={arXiv preprint arXiv:2012.07975},
  year={2020}
}
