Fast Style Transfer

A tensorflow implementation of fast style transfer described in the papers:

  * Perceptual Losses for Real-Time Style Transfer and Super-Resolution by Johnson
  * Instance Normalization by Ulyanov

I recommend checking my previous implementation of A Neural Algorithm of Artistic Style (neural style) here first, since this implementation is very similar to it.
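The transformation network in this line of work uses instance normalization as proposed by Ulyanov et al. Below is a minimal sketch of that operation in TensorFlow 1.x, for illustration only; the function name and variable scoping are assumptions, not this repository's code.

```python
import tensorflow as tf

def instance_norm(x, epsilon=1e-5):
    # Normalize each feature map of each sample over its spatial dimensions
    # (axes 1 and 2 for an NHWC tensor), then apply a learned scale and shift.
    channels = x.get_shape().as_list()[-1]
    mean, variance = tf.nn.moments(x, axes=[1, 2], keep_dims=True)
    scale = tf.get_variable('scale', [channels], initializer=tf.ones_initializer())
    shift = tf.get_variable('shift', [channels], initializer=tf.zeros_initializer())
    return scale * (x - mean) / tf.sqrt(variance + epsilon) + shift
```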

Sample results

All style images and content images used to produce the following sample results are given in the `style` and `content` folders.

Chicago

The following results were obtained from the chicago image with `--max_size 1024`; this image is commonly used in other implementations to demonstrate their performance.
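For context, `--max_size` caps the size of the content image before stylization. A rough sketch of how such a cap is typically applied with Pillow follows; the helper name and resampling choice are assumptions, not taken from this repository.

```python
from PIL import Image

def load_content(path, max_size=1024):
    # Scale the image down so its longer side is at most max_size,
    # preserving the aspect ratio; smaller images are left untouched.
    img = Image.open(path).convert('RGB')
    if max(img.size) > max_size:
        ratio = max_size / float(max(img.size))
        img = img.resize((int(img.width * ratio), int(img.height * ratio)),
                         Image.LANCZOS)
    return img
```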

Click on result images to see full size images.




Female Knight

The source image is from https://www.artstation.com/artwork/4zXxW

Results were obtained with the default settings except for `--max_size 1920`.
An image was rendered in approximately 100 ms on a GTX 980 Ti.

Click on result images to see full size images.




Usage

Prerequisites

  1. Tensorflow
  2. Python packages: numpy, scipy, PIL (or Pillow), matplotlib
  3. Pretrained VGG19 file: imagenet-vgg-verydeep-19.mat
          * Please download the file from the link above.
          * Save the file under the `pre_trained_model` folder (a quick loading check is sketched after this list).
  4. MSCOCO train2014 DB: train2014.zip
          * Please download the file from the link above. (Note that the file size is over 12 GB!)
          * Extract the images to the `train2014` folder.
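As a quick sanity check that the VGG19 weights are in place, the `.mat` file can be inspected with scipy. This snippet is only an illustrative assumption and is not part of the original instructions.

```python
import scipy.io

# Load the pretrained VGG19 weights saved under pre_trained_model/
# and confirm the expected 'layers' cell array is present.
vgg = scipy.io.loadmat('pre_trained_model/imagenet-vgg-verydeep-19.mat')
print(vgg.keys())            # should include a 'layers' entry
print(vgg['layers'].shape)   # a 1 x N cell array of layer definitions
```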

Train

python run_train.py --style 
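The command above is truncated in this copy of the README. A plausible full invocation is shown below for illustration only; every flag except `--style` is an assumption and should be verified against the options actually accepted by `run_train.py`.

```
# Illustrative only -- flag names other than --style are assumptions.
python run_train.py --style style/wave.jpg \
                    --vgg_model pre_trained_model \
                    --trainDB train2014 \
                    --output model
```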
