MASS (Muscle-Actuated Skeletal System)



This code implements a basic simulation and control framework for a full-body musculoskeletal system. Skeletal movements are driven by the actuation of the muscles, coordinated by activation levels. Through its Python and PyTorch interface, deep reinforcement learning (DRL) algorithms such as Proximal Policy Optimization (PPO) can be used.
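The control idea can be sketched as follows. Note that the class and method names below are illustrative only, not the actual MASS API; they just mirror the scheme in which a policy outputs per-muscle activation levels in [0, 1] that drive the skeletal simulation each step.

```python
import random

# Illustrative sketch only: ToyMuscleEnv is NOT the MASS environment.
# It stands in for a simulator whose skeleton is driven by muscle
# activation levels, the quantity a DRL policy would output.

class ToyMuscleEnv:
    """A stand-in for the musculoskeletal simulation."""

    def __init__(self, num_muscles=3):
        self.num_muscles = num_muscles
        self.state = [0.0] * num_muscles

    def step(self, activations):
        # Activation levels must lie in [0, 1], as in muscle models.
        a = [min(1.0, max(0.0, x)) for x in activations]
        # Toy dynamics: each state component relaxes toward its activation.
        self.state = [s + 0.1 * (ai - s) for s, ai in zip(self.state, a)]
        # Placeholder reward; the real system rewards imitation of a
        # reference motion.
        reward = -sum((ai - 0.5) ** 2 for ai in a)
        return list(self.state), reward


env = ToyMuscleEnv()
random.seed(0)
for _ in range(10):  # one short rollout, as a PPO-style loop would collect
    activations = [random.random() for _ in range(env.num_muscles)]
    state, reward = env.step(activations)
```

In the actual system the random activations above would come from a trained PyTorch policy network, and the rollouts would feed a PPO update.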


Seunghwan Lee, Kyoungmin Lee, Moonseok Park, and Jehee Lee, "Scalable Muscle-actuated Human Simulation and Control", ACM Transactions on Graphics (SIGGRAPH 2019), Volume 38, Article 73.

Project Page :

Youtube :

Paper :

How to install

Install TinyXML, Eigen, OpenGL, assimp, Python3, etc...

sudo apt-get install libtinyxml-dev libeigen3-dev libxi-dev libxmu-dev freeglut3-dev libassimp-dev libpython3-dev python3-tk python3-numpy virtualenv ipython3 cmake-curses-gui

Install boost with python3 (1.66)

We strongly recommend that you install the boost libraries from source (not via apt-get, etc.).

  • Download the boost source (version 1.66 or newer) from boost.org

  • Compile and Install the sources

cd /path/to/boost_1_xx/
./bootstrap.sh --with-python=python3
sudo ./b2 --with-python --with-filesystem --with-system --with-regex install
  • Check that the libraries were installed in your install directory
    (by default /usr/local; the prefix can be changed with b2's --prefix option)

If installed successfully, you should have something like

Headers:

  • /usr/local/include/boost/
  • /usr/local/include/boost/python/
  • /usr/local/include/boost/python/numpy/

Libraries:

  • /usr/local/lib/libboost_python3*.so
  • /usr/local/lib/libboost_filesystem.so
  • /usr/local/lib/libboost_system.so
  • /usr/local/lib/libboost_regex.so

(The exact libboost_python file name varies with the boost and Python versions, e.g. libboost_python36.so.)
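To avoid hard-to-diagnose build failures later, a quick sanity check of the install can help. This is a sketch that assumes the default /usr/local prefix; adjust PREFIX if you installed boost elsewhere.

```python
# Sanity check for the boost installation described above.
# Assumption: boost was installed under the default /usr/local prefix.
import glob
import os

PREFIX = "/usr/local"


def check_boost(prefix=PREFIX):
    """Return a list of human-readable problems; empty means all looks fine."""
    problems = []
    if not os.path.isdir(os.path.join(prefix, "include", "boost", "python")):
        problems.append("boost.python headers not found")
    # These are the four components built by the b2 command above.
    for lib in ("python", "filesystem", "system", "regex"):
        # Library file names vary across boost versions and platforms,
        # so match loosely with a glob.
        pattern = os.path.join(prefix, "lib", "libboost_" + lib + "*")
        if not glob.glob(pattern):
            problems.append("libboost_" + lib + " not found")
    return problems


if __name__ == "__main__":
    for p in check_boost():
        print("WARNING:", p)
```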

Install DART 6.3

Please refer to the official DART installation guide and install version 6.3.0.

If you try to use a newer version, the rendering code must be changed to match it, so using exactly version 6.3 is recommended.

Following the DART manual:

  1. install required dependencies

sudo apt-get install build-essential cmake pkg-config git
sudo apt-get install libeigen3-dev libassimp-dev libccd-dev libfcl-dev libboost-regex-dev libboost-system-dev
sudo apt-get install libopenscenegraph-dev
  1. install DART v6.3.0
git clone https://github.com/dartsim/dart.git
cd dart
git checkout tags/v6.3.0
mkdir build
cd build
cmake ..
make -j4
sudo make install

Install PIP things

You should first activate virtualenv.

virtualenv /path/to/venv --python=python3
source /path/to/venv/bin/activate
  • pytorch
pip3 install torch
pip3 install torchvision
  • numpy, matplotlib
pip3 install numpy matplotlib ipython

How to compile and run


Our system requires a reference motion to imitate. We provide sample references such as walking and running.

To learn and simulate, such metadata must be provided. We provide default metadata in data/metadata.txt; the text is parsed to set up the environment. Please note that the learning settings and the test settings must be identical (metadata.txt should not be changed between training and testing).
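The actual format of data/metadata.txt is defined by the MASS parser itself. Purely as an illustration of "parse the text and set up the environment", a minimal key/value reader might look like the sketch below; the keys and file names in the sample are hypothetical.

```python
# Illustrative only: the real metadata.txt format is defined by MASS's
# own parser. This sketch assumes a hypothetical "key value" line format
# just to show the parse-then-configure idea.

def parse_metadata(text):
    """Parse 'key value' lines into a settings dict, skipping blanks/comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(" ")
        settings[key] = value.strip()
    return settings


sample = """
# hypothetical contents
motion_file motion/walk.bvh
muscle_file muscle.xml
control_hz 30
"""

env_settings = parse_metadata(sample)
print(env_settings["control_hz"])  # prints 30
```

Because both training and rendering read the same file, keeping it unchanged between the two runs guarantees the environments match.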

Compile and Run

mkdir build
cd build
cmake .. 
make -j8
  • Run Training
    cd python
    source /path/to/virtualenv/bin/activate
    python3 main.py -d ../data/metadata.txt

All the trained networks are saved in the nn folder.

  • Run UI

    source /path/to/virtualenv/bin/activate
    ./render/render ../data/metadata.txt
  • Run Trained data

    source /path/to/virtualenv/bin/activate
    ./render/render ../data/metadata.txt ../nn/xxx.pt ../nn/xxx_muscle.pt

(Here xxx.pt and xxx_muscle.pt stand for the simulation and muscle networks saved during training.)

If you are simulating with the torque-actuated model,

source /path/to/virtualenv/bin/activate
./render/render ../data/metadata.txt ../nn/xxx.pt

Model Creation & Retargeting (this module is an ongoing project)

This requires Maya and MotionBuilder.

There is a sample model in the data/maya folder that I generally use. If you want to edit the model, you currently have to write your own Maya-Python export code and XML writer so that the simulation code can correctly read the musculoskeletal structure. There is also a rig model that is useful for retargeting new motions.
