DRRN

[Paper][Project]

Citation

If you find DRRN useful in your research, please consider citing:

```
@inproceedings{Tai-DRRN-2017,
  title={Image Super-Resolution via Deep Recursive Residual Network},
  author={Tai, Ying and Yang, Jian and Liu, Xiaoming},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  year={2017}
}
```

Other implementations

[DRRN-tensorflow] by LoSealL

[DRRN-pytorch] by yun_yang

[DRRN-pytorch] by yiyang7

Implement adjustable gradient clipping

Modify sgd_solver.cpp in your_caffe_root/src/caffe/solvers/ by adding the following code in the function ClipGradients(). Dividing the fixed threshold by the current learning rate implements the paper's adjustable gradient clipping, which clips gradients to [-θ/γ, θ/γ], where γ is the current learning rate:

```cpp
Dtype rate = GetLearningRate();
const Dtype clip_gradients = this->param_.clip_gradients() / rate;
```
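For context, here is a minimal sketch of what the full modified ClipGradients() might look like, based on the stock SGDSolver implementation shipped with Caffe. Only the first two lines are the DRRN change; the surrounding L2-norm clipping logic is Caffe's own and may differ slightly across Caffe versions:

```cpp
// Sketch: modified ClipGradients() in src/caffe/solvers/sgd_solver.cpp.
template <typename Dtype>
void SGDSolver<Dtype>::ClipGradients() {
  // DRRN change: scale the fixed threshold by 1/rate, so the effective
  // clipping bound grows as the learning rate decays during training.
  Dtype rate = GetLearningRate();
  const Dtype clip_gradients = this->param_.clip_gradients() / rate;
  if (clip_gradients < 0) { return; }  // clipping disabled in the prototxt
  const vector<Blob<Dtype>*>& net_params = this->net_->learnable_params();
  // Accumulate the squared L2 norm of all parameter gradients.
  Dtype sumsq_diff = 0;
  for (int i = 0; i < net_params.size(); ++i) {
    sumsq_diff += net_params[i]->sumsq_diff();
  }
  const Dtype l2norm_diff = std::sqrt(sumsq_diff);
  if (l2norm_diff > clip_gradients) {
    // Rescale every gradient so the global L2 norm equals the threshold.
    Dtype scale_factor = clip_gradients / l2norm_diff;
    for (int i = 0; i < net_params.size(); ++i) {
      net_params[i]->scale_diff(scale_factor);
    }
  }
}
```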

Training

  1. Prepare the training/validation data using the scripts generate_trainingset_x234 / generate_testingset_x234 in the "data" folder. The "Train_291" folder contains the 291 training images, and the "Set5" folder is a popular benchmark dataset.
  2. We release two DRRN architectures, DRRN_B1U9_20C128 and DRRN_B1U25_52C128, in the "caffe_files" folder (B is the number of recursive blocks and U the number of residual units per block, so B1U9 gives a depth of (1 + 2×9)×1 + 1 = 20 layers with 128 filters each). Choose either one for training. E.g., run ./train_DRRN_B1U9_20C128.sh

Test

  1. Remember to compile the MATLAB wrapper (make matcaffe), since we use MATLAB for testing.
  2. We release two pretrained models, DRRN_B1U9_20C128 and DRRN_B1U25_52C128, in the "model" folder. Choose either one to test on the Set5 benchmark. E.g., run the file ./test/DRRN_B1U9_20C128/test_DRRN_B1U9; the results are stored in the "results" folder, including both the reconstructed images and the PSNR/SSIM/IFC scores.

Benchmark results

Quantitative results

PSNR/SSIMs

IFCs

Qualitative results

Scale factor x2

Scale factor x3

Scale factor x4
