Photo-Sketching: Inferring Contour Drawings from Images

by mtli

This repo contains the training & testing code for our sketch generator. We also provide a [pre-trained model].

For technical details and the dataset, please refer to the [paper] and the [project page].

Setting up

The code is now updated to use PyTorch 0.4 and runs on Windows, Mac and Linux. For the obsolete version with PyTorch 0.3 (Linux only), please check out the branch pytorch-0.3-obsolete.

Windows users should use the corresponding Windows versions of the script files mentioned below.

One-line installation (with Conda environments)

conda env create -f environment.yml

Then activate the environment (sketch) and you are ready to go!

See the conda documentation for more information about conda environments.

Manual installation

See environment.yml for a list of dependencies.

Using the pre-trained model

  • Download the pre-trained model
  • Modify the model path in the test script
  • From the repo's root directory, run the test script

It supports a folder of images as input.
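To make the folder-of-images workflow concrete, here is a minimal sketch of batch inference over a directory, assuming a pix2pix-style PyTorch generator. The `netG`, `to_tensor`, and `sketch_folder` names below are illustrative stand-ins, not the repo's actual interface, and the tiny conv net only substitutes for the real pre-trained model so the example runs without the checkpoint:

```python
import os
import torch
import torch.nn as nn
from PIL import Image

# Stand-in generator: the real model is a pix2pix-style network loaded from
# the pre-trained checkpoint; a tiny conv net is used here so the sketch is
# runnable without the weights.
netG = nn.Sequential(nn.Conv2d(3, 1, kernel_size=3, padding=1), nn.Sigmoid())
netG.eval()

def to_tensor(img, size=256):
    """Resize and convert a PIL RGB image to a 1x3xHxW float tensor in [0, 1]."""
    img = img.resize((size, size))
    t = torch.tensor(list(img.getdata()), dtype=torch.float32) / 255.0
    return t.view(size, size, 3).permute(2, 0, 1).unsqueeze(0)

def sketch_folder(in_dir, out_dir):
    """Run the generator on every image in in_dir and save results to out_dir."""
    os.makedirs(out_dir, exist_ok=True)
    for name in sorted(os.listdir(in_dir)):
        if not name.lower().endswith((".jpg", ".jpeg", ".png")):
            continue
        img = Image.open(os.path.join(in_dir, name)).convert("RGB")
        with torch.no_grad():
            out = netG(to_tensor(img))[0, 0]          # HxW map in [0, 1]
        gray = (out * 255).byte().numpy()
        Image.fromarray(gray, mode="L").save(
            os.path.join(out_dir, os.path.splitext(name)[0] + ".png"))
```

The real test script handles this loop (plus model loading) for you; the point is only that each image is resized to the network's fixed input size, passed through the generator under `torch.no_grad()`, and written back out as a grayscale sketch.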

Train & test on our contour drawing dataset

  • Download the images and the rendered sketch from the project page
  • Unzip and organize them into the following structure:

    File structure

  • Modify the dataset path in the training and testing scripts

  • From the repo's root directory, run the training script to train the model

  • From the repo's root directory, run the testing script to test on the val set or the test set (specified by the phase flag)
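As an illustration of how a phase flag typically selects the split, here is a minimal argparse sketch. The flag name, defaults, and directory layout are assumptions for the example, not the repo's exact command-line interface:

```python
import argparse
import os

def build_data_path(root, phase):
    """Map the phase flag to the matching dataset subdirectory."""
    if phase not in ("train", "val", "test"):
        raise ValueError(f"unknown phase: {phase}")
    return os.path.join(root, phase)

parser = argparse.ArgumentParser()
parser.add_argument("--dataroot", default="data")
parser.add_argument("--phase", default="test", choices=["train", "val", "test"])

# e.g. evaluate on the val set instead of the test set
args = parser.parse_args(["--phase", "val"])
print(build_data_path(args.dataroot, args.phase))  # prints "data/val" on POSIX
```

Pix2pix-derived codebases conventionally use the phase value both to pick the data subdirectory and to name the results folder, so switching between val and test is a single flag change.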


If you use the code or the data for your research, please cite the paper:

  @inproceedings{LIPS2019,
    title={Photo-Sketching: Inferring Contour Drawings from Images},
    author={Li, Mengtian and Lin, Zhe and M\v ech, Radom\'ir and Yumer, Ersin and Ramanan, Deva},
    booktitle={WACV},
    year={2019}
  }


This code is based on an old version of pix2pix.
