Graphonomy: Universal Human Parsing via Graph Transfer Learning

This repository contains the code for the paper:

Graphonomy: Universal Human Parsing via Graph Transfer Learning, Ke Gong, Yiming Gao, Xiaodan Liang, Xiaohui Shen, Meng Wang, Liang Lin.

Environment and installation

  • PyTorch 0.4.0
  • torchvision
  • scipy
  • tensorboardX
  • numpy
  • opencv-python
  • matplotlib
  • networkx

You can install the packages above with:

pip install -r requirements.txt
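The command above assumes a requirements.txt in the repository root along these lines (the listed packages come from the environment section; any version pins other than PyTorch 0.4.0 are assumptions):

```text
torch==0.4.0
torchvision
scipy
tensorboardX
numpy
opencv-python
matplotlib
networkx
```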

Getting Started

Data Preparation

  • You need to download the human parsing dataset, prepare the images, and store them in /data/datasets/dataset_name/. We recommend symlinking the dataset paths into /data/datasets/ as follows:

```shell
# symlink the Pascal-Person-Part dataset, for example
ln -s /path_to_Pascal_Person_Part/* data/datasets/pascal/
```
  • The file structure should look like:

```
/Graphonomy
  /data
    /datasets
      /pascal
        /JPEGImages
        /list
        /SegmentationPart
      /CIHP_4w
        /Images
        /lists
        ...
```
  • The datasets (CIHP & ATR) are available on Google Drive and Baidu Drive. You also need to download the flipped labels: download cihp_flipped, unzip it, and store it in data/datasets/CIHP_4w/; download atr_flip, unzip it, and store it in data/datasets/ATR/.
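To catch path mistakes before training, the expected layout can be sanity-checked with a short script. This is a convenience sketch, not part of the repo; the directory names are taken from the tree shown above:

```python
import os

# Expected directories from the layout above (relative to the Graphonomy root).
EXPECTED = [
    "data/datasets/pascal/JPEGImages",
    "data/datasets/pascal/list",
    "data/datasets/pascal/SegmentationPart",
    "data/datasets/CIHP_4w/Images",
    "data/datasets/CIHP_4w/lists",
]

def missing_dirs(root="."):
    """Return the expected dataset directories that do not exist under root."""
    return [d for d in EXPECTED if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_dirs()
    if missing:
        print("Missing dataset directories:")
        for d in missing:
            print("  " + d)
    else:
        print("Dataset layout looks good.")
```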

Inference

We provide a simple script to get visualization results on the CIHP dataset using trained models:

```shell
# Example of inference
python exp/inference/inference.py \
  --loadmodel /path_to_inference_model \
  --img_path ./img/messi.jpg \
  --output_path ./img/ \
  --output_name /output_file_name
```
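To parse a whole folder of images, the single-image command can be wrapped in a small batch script. This is a sketch, assuming the flag names from the example above (--loadmodel, --img_path, --output_path, --output_name) and a hypothetical checkpoint path:

```python
import subprocess
from pathlib import Path

def build_cmd(model_path, img_path, output_path, output_name):
    """Assemble the inference command shown above for one image."""
    return [
        "python", "exp/inference/inference.py",
        "--loadmodel", model_path,
        "--img_path", str(img_path),
        "--output_path", output_path,
        "--output_name", output_name,
    ]

if __name__ == "__main__":
    model = "data/pretrained_model/inference_model.pth"  # hypothetical path
    img_dir = Path("./img")
    if img_dir.is_dir():
        for img in sorted(img_dir.glob("*.jpg")):
            cmd = build_cmd(model, img, "./img/", img.stem + "_parsed")
            subprocess.run(cmd, check=True)  # raises if the script fails
```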

Training

Transfer learning

  1. Download the Pascal pretrained model (available soon).
  2. Run sh train_transfer_cihp.sh.
  3. The results and models are saved in exp/transfer/run/.
  4. The evaluation and visualization script is eval_cihp.sh; you only need to change the --loadmodel argument before running it.

Universal training

  1. Download the pretrained model and store it in /data/pretrained_model/.
  2. Run sh train_universal.sh.
  3. The results and models are saved in exp/universal/run/.

Testing

If you want to evaluate the performance of a pre-trained model on the PASCAL-Person-Part or CIHP val/test set, simply run the corresponding evaluation script (eval_cihp/pascal.sh) and specify the model to load with --loadmodel. We also provide the final models, which you can download and store in /data/pretrained_model/.
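The numbers these evaluation scripts report are, as is standard for human parsing benchmarks, mean intersection-over-union (mIoU) over the part labels. A minimal illustration of the metric itself (not the repo's own evaluation code):

```python
import numpy as np

def mean_iou(pred: np.ndarray, gt: np.ndarray, num_classes: int) -> float:
    """Mean intersection-over-union over classes present in pred or gt."""
    ious = []
    for c in range(num_classes):
        p, g = pred == c, gt == c
        union = np.logical_or(p, g).sum()
        if union == 0:
            continue  # class absent from both masks; skip it
        inter = np.logical_and(p, g).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x2 label maps with two classes
pred = np.array([[0, 1], [1, 1]])
gt = np.array([[0, 1], [0, 1]])
print(round(mean_iou(pred, gt, 2), 3))  # → 0.583 (mean of 1/2 and 2/3)
```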

Models

Pascal-Person-Part trained model

|Model|Google Cloud|Baidu Yun|
|--------|--------------|-----------|
|Graphonomy(CIHP)|Download|Available soon|

CIHP trained model

|Model|Google Cloud|Baidu Yun|
|--------|--------------|-----------|
|Graphonomy(PASCAL)|Download|Available soon|

Universal trained model

|Model|Google Cloud|Baidu Yun|
|--------|--------------|-----------|
|Universal|Download|Available soon|

Todo:

  • [ ] release pretrained and trained models
  • [ ] update universal eval code & script

Citation

```
@inproceedings{Gong2019Graphonomy,
  author    = {Ke Gong and Yiming Gao and Xiaodan Liang and Xiaohui Shen and Meng Wang and Liang Lin},
  title     = {Graphonomy: Universal Human Parsing via Graph Transfer Learning},
  booktitle = {CVPR},
  year      = {2019}
}
```

Contact

If you have any questions about this repo, please feel free to contact [email protected].

Related work

  • Self-supervised Structure-sensitive Learning SSL
  • Joint Body Parsing & Pose Estimation Network JPPNet
  • Instance-level Human Parsing via Part Grouping Network PGN
