deep-photo-styletransfer

Code and data for paper "Deep Photo Style Transfer": https://arxiv.org/abs/1703.07511

Disclaimer

This software is published for academic and non-commercial use only.

Setup

This code is based on Torch. It has been tested on Ubuntu 14.04 LTS.

Dependencies:
  * Torch (with matio-ffi and loadcaffe)
  * Matlab or Octave

CUDA backend:
  * CUDA
  * cudnn

Download VGG-19:

sh models/download_models.sh

Compile

Compile cuda_utils.cu (adjust PREFIX and NVCC_PREFIX in makefile for your machine):

make clean && make
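The two variables live in the repository's makefile; the values below are illustrative only (not taken from the repository) and depend on where Torch and the CUDA toolkit are installed on your machine:

```makefile
# Illustrative example -- adjust both paths for your own installation.
# PREFIX:      Torch installation prefix
# NVCC_PREFIX: directory containing the nvcc compiler
PREFIX = $(HOME)/torch/install
NVCC_PREFIX = /usr/local/cuda/bin
```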

Usage

Quick start

To generate all results (in examples/) using the provided scripts, simply run run('gen_laplacian/gen_laplacian.m') in Matlab or Octave and then python gen_all.py in Python. The final output will be in examples/final_results/.
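The two quick-start steps can be scripted end to end. This is a sketch, assuming `matlab` and `python` are on your PATH (substitute `octave` for the first command if you use Octave); the driver script itself is not part of the repository:

```python
import subprocess

# The two quick-start steps from above, in order. "-nodisplay -nosplash -r"
# runs a Matlab script non-interactively; adjust for your Matlab version.
STEPS = [
    ["matlab", "-nodisplay", "-nosplash",
     "-r", "run('gen_laplacian/gen_laplacian.m'); exit"],
    ["python", "gen_all.py"],
]

def run_quickstart(dry_run=False):
    """Run both steps in order, stopping on the first failure."""
    for cmd in STEPS:
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_quickstart(dry_run=True)
```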

Basic usage

  1. Given input and style images with semantic segmentation masks, put them in examples/ respectively. They will have the following filename form: examples/input/in.png, examples/style/tar.png and examples/segmentation/in.png, examples/segmentation/tar.png;
  2. Compute the matting Laplacian matrix using gen_laplacian/gen_laplacian.m in Matlab. The output matrix will have the following filename form: gen_laplacian/Input_Laplacian_3x3_1e-7_CSR.mat;

Note: Please make sure that the content image resolution is consistent between the matting Laplacian computation in Matlab and the style transfer in Torch; otherwise the result won't be correct.
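One way to guard against the resolution mismatch warned about above is to compare image dimensions before running the Torch step. This helper is not part of the repository; it is a sketch that reads width and height straight from the PNG header using only the Python standard library:

```python
import struct

def png_size(path):
    """Return (width, height) read from a PNG file's IHDR chunk."""
    with open(path, "rb") as f:
        header = f.read(24)
    # Layout: 8-byte PNG signature, 4-byte chunk length, b"IHDR",
    # then big-endian 4-byte width and height.
    if header[:8] != b"\x89PNG\r\n\x1a\n" or header[12:16] != b"IHDR":
        raise ValueError("not a PNG file: %s" % path)
    width, height = struct.unpack(">II", header[16:24])
    return width, height

def check_consistent(matlab_input, torch_input):
    """Raise if the two content images differ in resolution."""
    a, b = png_size(matlab_input), png_size(torch_input)
    if a != b:
        raise ValueError("resolution mismatch: %s vs %s" % (a, b))
```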

  3. Run the following script to generate the segmented intermediate result:
    th neuralstyle_seg.lua -content_image <input> -style_image <style>
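When batching over many image pairs, the th invocation above can be assembled programmatically. This sketch only builds the argument list; the image paths are the example filenames used earlier, and any flags the script accepts beyond the two shown above must be passed through explicitly:

```python
def build_seg_command(content_image, style_image, extra_flags=None):
    """Assemble the th invocation for the segmented intermediate step."""
    cmd = ["th", "neuralstyle_seg.lua",
           "-content_image", content_image,
           "-style_image", style_image]
    # neuralstyle_seg.lua accepts further flags (segmentation masks,
    # output options, ...); forward them untouched.
    cmd += list(extra_flags or [])
    return cmd

print(" ".join(build_seg_command("examples/input/in.png",
                                 "examples/style/tar.png")))
```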
